Abstract

According to the World Health Organization, stroke is the third leading cause of disability. A common consequence of stroke is hemiparesis, an impairment of one side of the body that affects the performance of activities of daily living. It has been shown that rehabilitation outcomes improve when motor impairments are targeted as early as possible, when wearable mechatronic devices are used for robot-assisted therapy, and when the patient is given control of the robotic system. However, despite continued progress in control methods for wearable mechatronic devices, there remains a need for a more natural interface that allows better control. In this work, a user-independent gesture classification method is presented, based on a sensor fusion technique that combines surface electromyography (EMG) and inertial measurement unit (IMU) data. The Myo Armband was used to collect EMG and IMU data from healthy subjects, who were asked to perform 10 types of gestures in 4 different arm positions while wearing the device on their dominant limb. Data obtained from 14 participants were used to train a multilayer perceptron network to classify the gestures. Finally, the classification algorithm was tested on 5 novel users, obtaining an average accuracy of 78.94%. These results demonstrate that the proposed approach enables a more natural human-machine interface, allowing better control of wearable mechatronic devices during robot-assisted therapies.
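To make the classification pipeline concrete, the following is a minimal sketch of user-independent gesture classification with a multilayer perceptron over fused EMG and IMU features. This is not the authors' implementation: the feature set, network architecture, hyperparameters, and synthetic data are all illustrative assumptions. Only the 10-class gesture problem, the Myo Armband's 8 EMG channels and onboard IMU, and the held-out-user evaluation follow the abstract.

```python
# Minimal sketch (assumptions noted inline; not the authors' pipeline) of
# user-independent gesture classification from fused EMG + IMU features
# using a multilayer perceptron.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder data: in the study, features would be extracted from the
# Myo Armband's 8 EMG channels and its IMU. Here, 4 time-domain features
# per EMG channel (e.g., mean absolute value, waveform length) plus 9
# IMU-derived features are stand-ins; the true feature set is an
# assumption.
n_train_windows, n_test_windows = 2000, 500
n_features = 8 * 4 + 9      # 8 EMG channels x 4 features + 9 IMU features
n_gestures = 10             # 10 gesture classes, per the abstract

X_train = rng.normal(size=(n_train_windows, n_features))
y_train = rng.integers(0, n_gestures, size=n_train_windows)
X_test = rng.normal(size=(n_test_windows, n_features))
y_test = rng.integers(0, n_gestures, size=n_test_windows)

# Standardize the fused feature vectors, then classify with an MLP.
# The hidden-layer size and iteration budget are illustrative.
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
)
clf.fit(X_train, y_train)

# For user independence, X_test would contain windows from subjects held
# out of training entirely (the 5 novel users in the study).
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2%}")
```

The key design point this sketch reflects is the evaluation protocol: training and test sets are split by subject rather than by sample, so the reported accuracy measures generalization to users the classifier has never seen.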
