Abstract
Surface electromyogram (sEMG)-based hand gesture recognition, which interprets human commands from sEMG signals, performs well in many studies. However, its recognition accuracy drops dramatically under electrode shift because the distributions of the motion classes change. Recalibrating the system with samples collected after the shift maintains accuracy, but collecting labeled samples is inconvenient and time-consuming because the procedure is rigid, and calibration without labels may fail when the change is significant. This study proposes a user-friendly and convenient calibration method for hand gesture recognition based on unsupervised domain adaptation, which collects only unlabeled samples of preselected benchmark classes from users during calibration. A clustering method captures the change of the benchmark classes from the unlabeled samples, and regression models estimate the changes of the remaining classes from the benchmark classes. As a result, information from all classes is used to calibrate the system. Linear discriminant analysis is used to demonstrate the model. A dataset of ten subjects was collected to verify the performance empirically. Experimental results confirm that the method exploits the unlabeled benchmark-class samples during calibration and achieves 75.55% average accuracy. The method is more robust to electrode shift, improving accuracy by around 8.5% consistently across all subjects compared with methods that use no calibration or no label information during calibration. Although its accuracy is slightly lower than that of methods calibrated with labeled samples, its calibration data collection is more convenient and less complicated.
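The calibration pipeline described in the abstract (cluster unlabeled benchmark-class samples, regress the shift, extrapolate to the remaining classes) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the 2-D feature space, the synthetic affine electrode-shift model, and the nearest-mean cluster-to-class matching heuristic are all assumptions made for the example.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for sEMG features: 5 gesture classes in a 2-D space
# (assumed toy geometry, not real sEMG data).
old_means = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0],
                      [3.0, 3.0], [1.5, 1.5]])

# Electrode shift modeled (for illustration) as an unknown affine map.
A = np.array([[1.0, 0.1], [-0.1, 1.0]])
b = np.array([0.4, -0.2])
new_means = old_means @ A.T + b

# Step 1: users provide UNLABELED samples of 3 preselected benchmark classes.
benchmark = [0, 1, 2]
samples = np.vstack([new_means[c] + 0.05 * rng.normal(size=(50, 2))
                     for c in benchmark])

# Step 2: clustering recovers the shifted benchmark centers (labels unknown).
km = KMeans(n_clusters=len(benchmark), n_init=10, random_state=0).fit(samples)
centers = km.cluster_centers_

# Match each cluster to the nearest pre-shift benchmark mean (simple heuristic).
matched = np.array([centers[np.argmin(np.linalg.norm(centers - old_means[c],
                                                     axis=1))]
                    for c in benchmark])

# Step 3: regress shifted means on original means using benchmark classes only,
# then estimate the shifted means of ALL classes, including non-benchmark ones.
reg = LinearRegression().fit(old_means[benchmark], matched)
est_means = reg.predict(old_means)

# Mean estimation error of the shifted class means (small if the sketch works).
err = np.linalg.norm(est_means - new_means, axis=1).mean()
```

The estimated means `est_means` would then be used to update the class-conditional parameters of a classifier such as LDA, so the recalibration benefits every class even though only benchmark classes were sampled.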