Abstract

Recently, Convolutional Neural Networks (CNNs) have been used for the classification of hand activities from surface Electromyography (sEMG) signals. However, sEMG signals exhibit spatial sparsity, arising from the placement of electrodes over the hand muscles, and temporal dependency, since each activity is performed over a period of time. A CNN can extract spatial features but is limited in capturing temporal dependencies, whereas the Long Short-Term Memory (LSTM) network is designed to encode temporal relations in sequential data. Hence, in this paper, we propose EMGHandNet, a hybrid CNN and Bidirectional LSTM (Bi-LSTM) architecture that encodes both the inter-channel and the temporal dependencies of sEMG signals for hand activity classification. First, the CNN layers extract deep features from the sEMG signals; these feature maps are then processed by the Bi-LSTM to capture sequential information in both the forward and backward directions. Thus, the proposed model learns inter-channel and bidirectional temporal information in an end-to-end manner. The proposed model is trained and tested on five benchmark datasets: NinaPro DB1, NinaPro DB2, NinaPro DB4, BioPatRec DB2 and UCI Gesture. The average classification accuracies on NinaPro DB1, NinaPro DB2, NinaPro DB4 and UCI Gesture are 95.77%, 95.9%, 91.65% and 98.33%, respectively, corresponding to improvements of 4.42%, 12.2%, 18.65% and 1.33% over the respective state-of-the-art models. Moreover, a comparable performance of 91.29% is observed on the BioPatRec DB2 dataset. The experimental results and comparisons confirm the superiority of the proposed model for hand activity classification from sEMG signals.
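To make the described pipeline concrete, the following is a minimal sketch of a hybrid CNN and Bi-LSTM classifier for windowed multi-channel sEMG input, written in PyTorch. All layer sizes, kernel widths, the 10-channel / 200-sample window shape, and the class count are illustrative placeholders, not the EMGHandNet hyperparameters reported in the paper.

```python
# Sketch of a CNN + Bi-LSTM classifier for sEMG windows (assumed hyperparameters).
import torch
import torch.nn as nn

class CnnBiLstmClassifier(nn.Module):
    def __init__(self, n_channels=10, n_classes=52, cnn_features=64, lstm_hidden=128):
        super().__init__()
        # 1D convolutions over time extract local, inter-channel features.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, cnn_features, kernel_size=3, padding=1),
            nn.BatchNorm1d(cnn_features),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # The Bi-LSTM reads the CNN feature sequence in both directions.
        self.bilstm = nn.LSTM(input_size=cnn_features, hidden_size=lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * lstm_hidden, n_classes)

    def forward(self, x):
        # x: (batch, n_channels, time), e.g. one sEMG window per sample.
        feats = self.cnn(x)                 # (batch, cnn_features, time')
        feats = feats.permute(0, 2, 1)      # (batch, time', cnn_features)
        seq, _ = self.bilstm(feats)         # (batch, time', 2 * lstm_hidden)
        return self.classifier(seq[:, -1])  # logits over activity classes

# Usage: classify a batch of 8 windows with 10 electrodes and 200 samples each.
model = CnnBiLstmClassifier()
logits = model(torch.randn(8, 10, 200))    # -> shape (8, 52)
```

The key design point, per the abstract, is that the convolutional front end and the bidirectional recurrent back end are trained jointly end-to-end, so the CNN features are shaped by the temporal objective rather than being extracted in a separate preprocessing stage.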
