Abstract
Background and objective
Recognition of motor intention from electroencephalogram (EEG) signals has attracted considerable research interest in pattern recognition, owing to its notable application in non-muscular communication and control for people with severe motor disabilities. In EEG analysis, classification performance depends on an appropriate representation of EEG features, which is typically characterized by a single frequency band before a learning model is applied. Neglecting the other frequency components of the EEG signal can degrade recognition performance, because each frequency band carries its own discriminative information. Motivated by this idea, we propose to obtain distinguishable features across different frequencies by introducing an integrated deep learning model that accurately classifies multiple classes of upper limb movement intention.
Methods
The proposed model combines long short-term memory (LSTM) with a stacked autoencoder (SAE). To validate the method, four high-level amputees were recruited to perform five motor intention tasks. The acquired EEG signals were first preprocessed; the effect of input representation on the performance of the LSTM-SAE was then explored by feeding four task-related frequency bands into the model. The learning model was further improved with t-distributed stochastic neighbor embedding (t-SNE) to eliminate feature redundancy and enhance motor intention recognition.
Results
The proposed model achieved an average performance of 99.01% accuracy, 99.10% precision, 99.09% recall, 99.09% F1-score, 99.77% specificity, and 99.0% Cohen's kappa across multi-subject and multi-class scenarios. Further evaluation with 2-dimensional t-SNE revealed that the signal decomposition yields distinct multi-class separability in the feature space.
Conclusion
This study demonstrated the ability of the proposed model to accurately classify upper limb movements from multiple classes of EEG signals, and its potential application in the development of more intuitive and naturalistic prosthetic control.
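As an illustration of the band-decomposition step described in the Methods, the sketch below splits a synthetic single-channel EEG trace into frequency-band signals that could each serve as a separate input representation to a model such as the LSTM-SAE. This is a minimal sketch under stated assumptions: the sampling rate, the band edges, and the use of an ideal FFT band-pass filter are all hypothetical, since the abstract does not specify the paper's actual preprocessing pipeline or its four task-related bands.

```python
import numpy as np

def bandpass(sig, low, high, fs):
    """Ideal (brick-wall) FFT band-pass filter -- a simplification for
    illustration; the paper's actual filtering method is not specified."""
    freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs)
    spec = np.fft.rfft(sig)
    spec[(freqs < low) | (freqs >= high)] = 0.0  # zero out-of-band bins
    return np.fft.irfft(spec, n=sig.size)

fs = 250                          # hypothetical sampling rate (Hz)
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1.0 / fs)
# Synthetic single-channel "EEG": a 10 Hz (alpha-range) rhythm plus noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Hypothetical band edges; the four task-related bands used in the paper
# are not given in the abstract.
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
features = {name: bandpass(eeg, lo, hi, fs) for name, (lo, hi) in bands.items()}
```

Each entry of `features` is a band-limited time series of the same length as the raw signal; in a pipeline like the one described, each band would be fed to the learning model as its own input representation rather than collapsing the signal to one frequency.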