Abstract

In this study, a novel control algorithm based on a P300 brain-computer interface (BCI) is deployed to control a 2-DoF robotic arm. Eight subjects, five men and three women, perform a two-dimensional target-tracking task in a simulated environment. Their electroencephalography (EEG) signals are recorded over the visual cortex, and the P300 components are extracted and evaluated to deliver a real-time BCI-based controller. Each volunteer's intention is recognized and decoded into an appropriate command for controlling the cursor. The final processed BCI output drives a simulated robotic arm in two-dimensional space. The results show that the system allows the robot's end-effector to move between arbitrary positions in a point-to-point session with the desired accuracy. The model is also tested and compared on Dataset II of the BCI Competition. The best result is obtained with a multi-classifier solution, reaching a recognition rate of 97 percent without channel selection prior to classification.
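For illustration only, the sketch below shows one way a multi-classifier P300 detector of the kind described above could be assembled; it is not the paper's implementation. The epoch dimensions, the LDA and logistic-regression base classifiers, the soft-voting ensemble, and the synthetic data are all assumptions.

```python
# Minimal sketch of a multi-classifier P300 detection pipeline (illustrative,
# not the authors' method): epoched EEG trials are flattened into feature
# vectors and classified by a soft-voting ensemble.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for epoched EEG: 400 trials, 8 channels, 120 samples
# per epoch (e.g. roughly 0-600 ms post-stimulus at 200 Hz) -- assumed values.
n_trials, n_channels, n_samples = 400, 8, 120
X = rng.normal(size=(n_trials, n_channels, n_samples))
y = rng.integers(0, 2, size=n_trials)      # 1 = target (P300 present), 0 = non-target
X[y == 1, :, 50:80] += 0.5                 # crude P300-like deflection in target epochs

X_flat = X.reshape(n_trials, -1)           # channels x time -> one feature vector per trial
X_tr, X_te, y_tr, y_te = train_test_split(X_flat, y, test_size=0.25, random_state=0)

# "Multi-classifier" decision: several base classifiers combined by soft voting.
ensemble = VotingClassifier(
    estimators=[
        ("lda", make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())),
        ("logreg", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
    ],
    voting="soft",
)
ensemble.fit(X_tr, y_tr)
print("held-out accuracy:", ensemble.score(X_te, y_te))

# A detected target epoch would then be mapped by the BCI controller to a
# cursor / end-effector command (e.g. move left, right, up, or down).
```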
