Abstract

When a continuous motion estimation (CME) model based on surface electromyography (sEMG) is applied to a human-robot system, it inevitably encounters motions that differ from those used during model training. Current approaches have been shown to lose significant prediction accuracy on such untrained motions. We therefore propose a novel CME method that uses muscle synergy as the input feature to achieve better prediction accuracy on untrained motion tasks. Specifically, deep non-smooth non-negative matrix factorization (Deep-nsNMF) is first introduced for synergy extraction to improve the efficiency of synergy decomposition. After obtaining activation primitives from the various training motions, we propose a redundancy classification (RC) algorithm that identifies shared and task-specific synergies, improving on the original redundancy segmentation (RS) algorithm. A NARX neural network serves as the regression model. Finally, the model was tested on prediction tasks for eight untrained motions. The proposed method predicted more accurately than the same network trained on time-domain features. With Deep-nsNMF and RS, the highest accuracy reached 99.7%; Deep-nsNMF with RC performed comparably and was more stable across motions and subjects. A limitation of the approach is that the prediction error remains positively correlated with the absolute value of the true joint angle, which requires further work. Overall, this research demonstrates that CME models can perform well in complex scenarios, supporting the feasibility of deploying CME in real-world applications.
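The synergy-extraction step factors the sEMG envelope matrix into activation primitives and muscle weighting vectors. As a minimal, hedged sketch of that idea, the snippet below uses standard NMF from scikit-learn in place of the paper's Deep-nsNMF, on simulated data; the array shapes, component count, and variable names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: plain NMF stands in for the paper's Deep-nsNMF.
# All names, shapes, and the synergy count are illustrative assumptions.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
# Simulated rectified sEMG envelopes: 200 time samples x 8 channels (non-negative).
emg = np.abs(rng.standard_normal((200, 8)))

# Factor emg ≈ W @ H into 4 synergies:
#   W: activation primitives (time courses), H: muscle weighting vectors.
model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(emg)   # shape (200, 4)
H = model.components_          # shape (4, 8)

print(W.shape, H.shape)
```

The rows of `H` could then be compared across training motions to separate shared from task-specific synergies, and the columns of `W` would feed the downstream regression model as features.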
