Abstract

Brain–machine interfaces are systems that allow a device such as a robot arm to be controlled through a person's brain activity; such devices can be used by disabled persons to enhance their quality of life and improve their independence. This paper is an extended version of a work that aims at discriminating between left and right imagined hand movements using a support vector machine (SVM) classifier to control a robot arm, helping a person find an object in the environment. The main focus here is to search for the features that most efficiently describe the electroencephalogram (EEG) data during such imagined gestures, by comparing two feature extraction methods, namely the continuous wavelet transform (CWT) and empirical mode decomposition (EMD), each combined with principal component analysis (PCA), whose resulting features were fed into linear and radial basis function (RBF) kernel SVM classifiers. The experimental results showed high performance, achieving an average accuracy across all subjects of 92.75% with the RBF kernel SVM using CWT and PCA, compared to 80.25% obtained with EMD and PCA. The proposed system has been implemented and tested using data collected from five male subjects, and it enabled control of the robot arm in the left and right directions.
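The following is a minimal sketch, not the authors' code, of the best-performing pipeline the abstract describes (CWT features, PCA dimensionality reduction, RBF kernel SVM). The library choices (PyWavelets, scikit-learn), the Morlet wavelet, the scale range, the number of principal components, and the synthetic stand-in data are all illustrative assumptions; real preprocessed EEG epochs around the motor-imagery cue would replace the random arrays.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score


def cwt_features(trials, scales=np.arange(1, 31), wavelet="morl"):
    """Build one feature vector per trial from per-channel CWT coefficient energies.

    trials: array of shape (n_trials, n_channels, n_samples).
    """
    feats = []
    for trial in trials:
        per_channel = []
        for channel in trial:
            coeffs, _ = pywt.cwt(channel, scales, wavelet)
            # Log energy of the wavelet coefficients at each scale.
            per_channel.append(np.log(np.mean(coeffs ** 2, axis=1)))
        feats.append(np.concatenate(per_channel))
    return np.asarray(feats)


# Synthetic stand-in data: 40 trials, 8 EEG channels, 512 samples each.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 8, 512))
y = rng.integers(0, 2, size=40)  # 0 = left, 1 = right imagined hand movement (labels illustrative)

X = cwt_features(X_raw)
clf = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
print("Cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

Swapping `cwt_features` for a function that extracts intrinsic mode functions via EMD would give the alternative pipeline compared in the paper, with the rest of the PCA + SVM chain unchanged.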
