Abstract

Facilitating independent living for individuals with upper extremity impairment is a compelling goal for our society. The degree of disability of these individuals could potentially be reduced by robotic devices that assist their movements in activities of daily living. One approach to controlling such robotic systems is a brain–computer interface, which detects the user’s intention. This study proposes a method for estimating the user’s intention from electroencephalographic (EEG) signals. The proposed method discriminates rest from various imagined arm movements, including grasping and elbow flexion. The features extracted from the EEG signals are autoregressive model coefficients, root-mean-square amplitude, and waveform length. A support vector machine was used as the classifier to distinguish class labels corresponding to rest and imagined arm movements. The performance of the proposed method was evaluated using cross-validation. Average accuracies of 91.8 ± 5.8% and 90.0 ± 4.1% were obtained for rest versus grasping and rest versus elbow flexion, respectively. The results show that the proposed scheme provides 18.9, 17.1, and 16.5% higher classification accuracies for distinguishing rest versus grasping, and 21.9, 17.6, and 18.1% higher accuracies for distinguishing rest versus elbow flexion, compared with the widely used filter bank common spatial pattern, band power, and common spatial pattern methods, respectively.
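As a concrete illustration of the pipeline described above, the sketch below extracts autoregressive coefficients, root-mean-square amplitude, and waveform length per EEG channel and feeds them to a support vector machine evaluated with cross-validation. The AR model order, window length, channel count, and synthetic data are assumptions made for illustration, not the settings used in the study.

    import numpy as np
    from scipy.linalg import toeplitz
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def ar_coefficients(x, order=6):
        # Yule-Walker estimate of AR model coefficients (order is an assumption)
        x = x - x.mean()
        r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order] / len(x)
        return np.linalg.solve(toeplitz(r[:order]), r[1:order + 1])

    def features(window):
        # window: (channels, samples); concatenate per-channel features
        feats = []
        for ch in window:
            feats.extend(ar_coefficients(ch))          # autoregressive coefficients
            feats.append(np.sqrt(np.mean(ch ** 2)))    # root-mean-square amplitude
            feats.append(np.sum(np.abs(np.diff(ch))))  # waveform length
        return np.array(feats)

    # Synthetic stand-in for segmented EEG: 100 windows, 8 channels, 256 samples
    rng = np.random.default_rng(0)
    windows = rng.standard_normal((100, 8, 256))
    labels = rng.integers(0, 2, size=100)              # 0 = rest, 1 = motor imagery

    X = np.vstack([features(w) for w in windows])
    scores = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5)
    print("cross-validated accuracy: %.2f" % scores.mean())

With real data, the random windows and labels would be replaced by band-pass-filtered EEG segments and their rest/imagery annotations.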

Summary

Introduction

The use of brain–computer interfaces (BCIs) has been shown to be promising for detecting users’ intentions and controlling robotic devices [1]. EEG signals can be correlated with tasks performed by an individual [3]; such tasks include mental computation [4], imagining motor movements [5], imagining speech [6], and experiencing emotions [7]. Various classification methods have been proposed for classifying EEG signals. An Elman neural network (ENN) trained by the resilient backpropagation (BP) algorithm was used to classify mental tasks, achieving an accuracy of 86% [8]. The power extracted at spectral frequencies has been used to classify five mental tasks with a fuzzy classifier [9], yielding a classification efficiency of 65–100%. An MLP–BP classifier with adaptive autoregression [11] achieved an accuracy of 81.80%.
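For comparison with the spectral-power approaches cited above, a minimal band-power feature extractor might look as follows; the sampling rate, band edges, and Welch parameters are assumptions, not values from the cited studies.

    import numpy as np
    from scipy.signal import welch

    def band_power(signal, fs=256.0, bands=((8, 12), (12, 30))):
        # Mean power spectral density in the mu (8-12 Hz) and beta (12-30 Hz) bands
        freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 256))
        return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]

    # Example: one second of synthetic single-channel EEG at 256 Hz
    x = np.random.default_rng(1).standard_normal(256)
    print(band_power(x))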

Methods
Results
Discussion
Conclusion