Abstract

Recognizing user intention in reach-to-grasp motions is a critical challenge in rehabilitation engineering. To address this, a Machine Learning (ML) algorithm based on the Extreme Learning Machine (ELM) was developed to identify motor actions from surface Electromyography (sEMG) during continuous reach-to-grasp movements involving multiple Degrees of Freedom (DoFs). This study explores feature extraction methods based on the time domain and autoregressive models to evaluate ELM performance under different conditions. The experimental setup encompassed variations in the number of neurons, time window length, validation with each muscle, increases in the number of features, comparison with five conventional ML-based classifiers, inter-subject variability, and the temporal dynamic response. To evaluate the efficacy of the proposed ELM-based method, an openly available sEMG dataset containing data from 12 participants was used. Results highlight the method's performance, achieving Accuracy above 85%, F-score above 90%, Recall above 85%, Area Under the Curve of approximately 84%, and computation times (computational cost) of less than 1 ms. These metrics significantly outperform standard methods (p < 0.05). Additionally, specific trends were found of increasing and decreasing performance in identifying specific tasks, as well as variations in the continuous transitions of the temporal dynamic response. Thus, the ELM-based method effectively identifies continuous reach-to-grasp motions from myoelectric data. These findings hold promise for practical applications. The method's success prompts future research into implementing it for more reliable and effective Human-Machine Interface (HMI) control. This could transform real-time upper limb rehabilitation, enabling natural and complex Activities of Daily Living (ADLs) such as object manipulation. The robust results encourage further research and innovative solutions to improve people's quality of life through more effective interventions.
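
To make the pipeline described above concrete, the sketch below shows one plausible implementation of windowed time-domain sEMG features feeding a single-hidden-layer ELM classifier. It is a minimal illustration only, assuming a NumPy-based implementation; the feature set (MAV, WL, ZC, SSC), sigmoid activation, and neuron count are illustrative assumptions, not the exact configuration reported in the study.

import numpy as np

def time_domain_features(window):
    # Common time-domain sEMG features for one analysis window (1-D array).
    mav = np.mean(np.abs(window))                          # Mean Absolute Value
    wl = np.sum(np.abs(np.diff(window)))                   # Waveform Length
    zc = np.sum(np.diff(np.sign(window)) != 0)             # Zero Crossings
    ssc = np.sum(np.diff(np.sign(np.diff(window))) != 0)   # Slope Sign Changes
    return np.array([mav, wl, zc, ssc], dtype=float)

class ELMClassifier:
    # Single-hidden-layer ELM: random input weights and biases are fixed,
    # and the output weights are solved in closed form by least squares.
    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Sigmoid activation of the random hidden layer.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        # X: (n_samples, n_features) feature matrix; y: integer task labels.
        self.classes_ = np.unique(y)
        T = (y[:, None] == self.classes_[None, :]).astype(float)  # one-hot targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ T   # Moore-Penrose pseudoinverse solution
        return self

    def predict(self, X):
        return self.classes_[np.argmax(self._hidden(X) @ self.beta, axis=1)]

Because the only trained parameters (beta) are obtained from a single pseudoinverse, training and per-window prediction are inexpensive, which is consistent with the sub-millisecond computational cost highlighted in the abstract.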
