Abstract
The surface electromyography (EMG) signal reflects the user's intended actions and has become an important signal source for human-computer interaction. However, classification models trained on EMG signals from one day cannot be applied on other days, owing to the time-varying characteristics of the EMG signal and to electrode shift caused by re-wearing the device, which hinders the application of commercial prosthetics. Gesture recognition across different days is usually referred to as long-term gesture recognition.
Approach. To address this issue, we propose a long-term gesture recognition method that optimizes feature extraction, dimensionality reduction, and classification-model calibration in EMG signal recognition. Our method extracts differential common spatial pattern (CSP) features and then reduces their dimensionality with non-negative matrix factorization (NMF), effectively reducing the influence of the non-stationarity of the EMG signals. Based on a clustering and classification self-training (CCST) scheme, we select high-confidence samples from the unlabeled samples and use them to adaptively update the model before daily formal use.
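The NMF dimensionality-reduction and confidence-based self-training steps described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature matrix stands in for the differential CSP features, and the classifier choice (LDA), NMF rank, and confidence threshold are all assumptions made for the example.

```python
# Sketch: NMF dimensionality reduction + confidence-based self-training
# for daily model recalibration. Data, rank, and threshold are illustrative.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Stand-ins for non-negative CSP-style features:
# 200 labeled samples (previous days), 40 unlabeled samples (new day), 4 gestures.
X_lab = rng.random((200, 30))
y_lab = rng.integers(0, 4, 200)
X_new = rng.random((40, 30))

# Reduce dimensionality with NMF (rank 8 chosen arbitrarily here).
nmf = NMF(n_components=8, init="nndsvda", max_iter=500, random_state=0)
H_lab = nmf.fit_transform(X_lab)
H_new = nmf.transform(X_new)

# Train an initial classifier, then keep only new-day predictions whose
# posterior probability exceeds a confidence threshold (an assumption).
clf = LinearDiscriminantAnalysis().fit(H_lab, y_lab)
proba = clf.predict_proba(H_new)
conf_mask = proba.max(axis=1) > 0.9
pseudo_y = proba.argmax(axis=1)

# Recalibrate the model with the high-confidence pseudo-labeled samples.
X_aug = np.vstack([H_lab, H_new[conf_mask]])
y_aug = np.concatenate([y_lab, pseudo_y[conf_mask]])
clf_updated = LinearDiscriminantAnalysis().fit(X_aug, y_aug)
```

In the paper's scheme the pseudo-labels additionally pass through a clustering stage (the "C" in CCST) before being trusted; the simple probability threshold above is a placeholder for that step.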
Main results. We verify the feasibility of our method on a dataset consisting of 30 days of gesture data. The proposed gesture recognition scheme achieves an accuracy of over 90%, similar to the performance of daily calibration with labeled data. However, our method needs only one repetition of unlabeled gesture samples to update the classification model before daily formal use.
Significance. From these results we conclude that the proposed method not only ensures superior performance but also greatly facilitates daily use, making it especially suitable for long-term application.