Abstract

Lower-limb exoskeleton robots can support the affected limbs of hemiplegic patients and assist in their rehabilitation. To set effective control strategies, the user’s motion intention must be obtained accurately and in a timely manner, which poses many challenges. The surface electromyography (sEMG) signal has long been used to detect a person’s locomotion mode. However, most traditional myoelectric motion recognition methods must collect multi-channel sEMG signals (more than eight channels) and rely on feature engineering to reach the desired accuracy. In addition, traditional methods make decisions even when the recognition probability is still uncertain, resulting in misclassification during continuous recognition. To overcome these limitations, a dual-purpose autoencoder-guided temporal convolution network (DA-TCN) is proposed, serving as an sEMG-based motion detection model for a lower-limb exoskeleton robot. In the proposed method, four-channel sEMG signals are first acquired without any feature engineering. Next, the DA-TCN extracts discriminative deep features from the sEMG signals; compared with the merely separable features of a common TCN, these discriminative features help obtain a more discriminative twins-loss. Finally, the classification result is redetermined by the decoder based on the reconstruction of the deep features, and misclassifications are rejected according to the value of the twins-loss. The performance of the proposed method was evaluated on data collected from seven volunteer subjects performing seven locomotion modes. The proposed method effectively detected misclassifications and significantly improved the stability of continuous recognition.
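The abstract does not specify the network’s layer configuration, the exact form of the twins-loss, or the rejection rule, so the following is only a minimal PyTorch sketch of the general idea: a causal TCN encoder over four-channel sEMG windows, a classifier head, and a decoder whose feature-reconstruction error stands in for the paper’s rejection criterion. All names (DATCNSketch, predict_with_rejection), layer sizes, and the threshold tau are hypothetical, not the authors’ design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TCNBlock(nn.Module):
    """One dilated causal convolution block with a residual connection."""
    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation  # left-only padding keeps causality
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.norm = nn.BatchNorm1d(channels)

    def forward(self, x):
        y = F.pad(x, (self.pad, 0))            # pad on the left only
        y = F.relu(self.norm(self.conv(y)))
        return x + y                            # residual connection

class DATCNSketch(nn.Module):
    """Hypothetical DA-TCN-style network: TCN encoder, classifier head,
    and a decoder that reconstructs the deep feature for rejection."""
    def __init__(self, in_channels=4, hidden=64, n_classes=7, feat_dim=32):
        super().__init__()
        self.inp = nn.Conv1d(in_channels, hidden, kernel_size=1)
        self.encoder = nn.Sequential(
            TCNBlock(hidden, dilation=1),
            TCNBlock(hidden, dilation=2),
            TCNBlock(hidden, dilation=4),
        )
        self.to_feat = nn.Linear(hidden, feat_dim)   # deep feature z
        self.classifier = nn.Linear(feat_dim, n_classes)
        self.decoder = nn.Sequential(                # reconstructs z
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, feat_dim),
        )

    def forward(self, x):                            # x: (batch, 4, time)
        h = self.encoder(self.inp(x)).mean(dim=-1)   # temporal pooling
        z = self.to_feat(h)
        return self.classifier(z), z, self.decoder(z)

def predict_with_rejection(model, x, tau=0.05):
    """Accept a prediction only if the decoder reconstructs the deep
    feature well; otherwise flag the window as a likely misclassification.
    tau is a hypothetical threshold tuned on validation data."""
    model.eval()
    with torch.no_grad():
        logits, z, z_hat = model(x)
        err = F.mse_loss(z_hat, z, reduction="none").mean(dim=-1)
        pred = logits.argmax(dim=-1)
        pred[err > tau] = -1                         # -1 marks a rejected window
    return pred, err

if __name__ == "__main__":
    model = DATCNSketch()
    windows = torch.randn(8, 4, 200)   # 8 windows of 4-channel sEMG, 200 samples each
    pred, err = predict_with_rejection(model, windows)
    print(pred, err)
```

In a deployed controller, rejected windows would presumably fall back to the previous stable decision rather than trigger a mode switch, which is how a rejection stage can improve the stability of continuous recognition; the threshold would be calibrated on held-out data.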
