Abstract

Brain–machine interfaces are systems that allow a device, such as a robot arm, to be controlled through a person's brain activity; such devices can be used by disabled persons to improve their independence and quality of life. This paper is an extended version of a work that aims at discriminating between left and right imagined hand movements using a support vector machine (SVM) classifier to control a robot arm that helps a person locate an object in the environment. The main focus here is the search for features that efficiently describe the electroencephalogram (EEG) data recorded during such imagined gestures, comparing two feature extraction methods, the continuous wavelet transform (CWT) and the empirical mode decomposition (EMD), each combined with principal component analysis (PCA) and fed to SVM classifiers with linear and radial basis function (RBF) kernels. The experimental results show high performance, with an average accuracy across all subjects of 92.75% for the RBF kernel SVM using CWT and PCA, compared to 80.25% obtained with EMD and PCA. The proposed system was implemented and tested on data collected from five male subjects and enabled control of the robot arm in the left and right directions.
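To make the described pipeline concrete, the sketch below shows one way a CWT + PCA + RBF-kernel SVM classifier of this kind could be assembled in Python with PyWavelets and scikit-learn. It is a minimal illustration, not the authors' implementation: the channel count, sampling rate, mother wavelet, scale range, and number of retained PCA components are assumptions chosen for the example.

```python
# Minimal sketch: CWT features + PCA + RBF-kernel SVM for left/right motor imagery.
# Sampling rate, wavelet, scales, and PCA dimensionality are illustrative assumptions.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def cwt_features(trial, fs=250.0, scales=np.arange(1, 32)):
    """Flatten per-channel CWT magnitudes of one EEG trial into a feature vector.

    trial: array of shape (n_channels, n_samples) for a single imagery epoch.
    """
    feats = []
    for channel in trial:
        coeffs, _ = pywt.cwt(channel, scales, "morl", sampling_period=1.0 / fs)
        feats.append(np.abs(coeffs).ravel())
    return np.concatenate(feats)

def train_classifier(X_trials, y):
    """X_trials: (n_trials, n_channels, n_samples) EEG epochs; y: 0 = left, 1 = right."""
    X = np.array([cwt_features(trial) for trial in X_trials])
    clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf"))
    clf.fit(X, y)
    return clf
```

Swapping the CWT feature step for intrinsic mode functions obtained by EMD, or the RBF kernel for a linear one, would reproduce the other configurations compared in the paper.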
