Abstract

Background: Online control of an artificial or virtual arm using information decoded from EEG is normally realized by classifying different activation states or voluntary modulation of sensorimotor activity linked to different overt actions of the subject. A more natural control scheme, such as decoding the trajectory of imagined 3D arm movements to move a prosthetic, robotic, or virtual arm, has been reported in only a limited number of studies, all using offline feed-forward control schemes.

Objective: In this study, we report the first attempt to realize online control of two virtual arms generating movements toward three targets per arm in 3D space. The 3D trajectory of imagined arm movements was decoded from the power spectral density of mu, low-beta, high-beta, and low-gamma EEG oscillations using multiple linear regression. The analysis was performed on a dataset recorded from three subjects over seven sessions, each comprising three experimental blocks: an offline calibration block and two online feedback blocks. Target classification accuracy using the predicted trajectories of the virtual arms was computed and compared with the results of a filter-bank common spatial patterns (FBCSP) based multi-class classification method incorporating mutual information (MI) feature selection and linear discriminant analysis (LDA) modules.

Main Results: Target classification accuracy from the predicted trajectory of imagined 3D arm movements in the offline runs for two subjects (mean 45%, SD 5%) was significantly higher (p < 0.05) than chance level (33.3%). Nevertheless, accuracy during real-time control of the virtual arms using the trajectory decoded directly from EEG remained at chance level (33.3%). However, the results of two subjects suggest that false-positive feedback may increase accuracy in closed-loop control. The FBCSP-based multi-class classification method distinguished imagined movements of the left and right arm with reasonable accuracy for two of the three subjects (mean 70%, SD 5%, compared to a 50% chance level). However, classification of imagined arm movements toward three targets was not successful with the FBCSP classifier, as the achieved accuracy (mean 33%, SD 5%) was similar to chance level (33.3%). Sub-optimal components of the multi-session experimental paradigm were identified, and an improved paradigm is proposed.
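The trajectory-decoding idea described above can be sketched in a few lines: predict 3D coordinates from band-power (PSD) features of EEG using multiple linear regression. This is a minimal sketch on synthetic data; the channel count, number of windows, and noise level are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

n_windows = 500        # number of sliding analysis windows (assumed)
n_channels = 32        # assumed EEG montage size
bands = ["mu", "low_beta", "high_beta", "low_gamma"]

# X: one PSD feature per (channel, band) pair for each window (synthetic).
X = rng.standard_normal((n_windows, n_channels * len(bands)))
# Y: 3D trajectory coordinates (x, y, z) for each window, generated from
# a known linear mapping plus noise so the regression has a recoverable target.
true_W = rng.standard_normal((X.shape[1], 3))
Y = X @ true_W + 0.1 * rng.standard_normal((n_windows, 3))

# Multiple linear regression via least squares, with an intercept term,
# fitting all three axes at once.
X_aug = np.hstack([np.ones((n_windows, 1)), X])
W, *_ = np.linalg.lstsq(X_aug, Y, rcond=None)

# Decode the trajectory and check per-axis correlation with the truth.
Y_hat = X_aug @ W
r = [np.corrcoef(Y[:, k], Y_hat[:, k])[0, 1] for k in range(3)]
print([round(v, 2) for v in r])
```

In a real pipeline, Y would be the recorded or instructed 3D trajectory from the calibration block, and the fitted weights would then map online PSD features to virtual-arm coordinates.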

Highlights

  • Brain-computer interface (BCI) research targets movement-free communication between a human user and an electronic device using information encoded in the electrophysiological activity of the brain without involving neuromuscular pathways

  • A visual inspection of the results presented in Supplementary Figures 1A1–A3 showed that, for most sessions, the speed of identical imagined movements was estimated correctly by the applied motion trajectory prediction (MTP) model in the direction matching the imagined movement

  • Only a limited number of studies reported decoding the trajectory of imagined limb movements from EEG, and to the best of the authors’ knowledge, the real-time decoding of imagined 3D arm movement trajectories from EEG has not yet been studied in a closed loop


Introduction

Brain-computer interface (BCI) research targets movement-free communication between a human user and an electronic device using information encoded in the electrophysiological activity of the brain without involving neuromuscular pathways. Two significantly different approaches are commonly used to achieve continuous control over electronic devices using non-invasively recorded electrophysiological correlates of motor function (van Gerven et al., 2009): sensorimotor rhythm (SMR) based multi-class classification and movement/motion trajectory prediction (MTP) (in some applications, movement direction classification). The first approach uses SMR-based multi-class classification to assign control commands to different cognitive task-specific brain activity patterns, normally using discrete time windows of information during movement or movement imagery (Pfurtscheller et al., 2006; Morash et al., 2008) to achieve multi-functional control over objects in real (LaFleur et al., 2013) or virtual space (Royer et al., 2010). A more natural control scheme, such as decoding the trajectory of imagined 3D arm movements to move a prosthetic, robotic, or virtual arm, has been reported in only a limited number of studies, all using offline feed-forward control schemes.
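The first approach, SMR-based classification, can be illustrated with a minimal common spatial patterns (CSP) plus LDA pipeline on synthetic two-class data (left vs. right arm imagery). This is a simplified sketch, not the study's implementation: a real FBCSP pipeline would band-pass filter the EEG into several frequency bands, extract CSP features per band, and select features by mutual information before LDA; here a single band and a plain Fisher LDA are assumed, and the trial dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def csp_filters(trials_a, trials_b, n_pairs=2):
    """CSP spatial filters from two classes of trials shaped (channels, time)."""
    cov = lambda ts: np.mean([x @ x.T / np.trace(x @ x.T) for x in ts], axis=0)
    Ca, Cb = cov(trials_a), cov(trials_b)
    # Whiten the composite covariance, then diagonalize class A in that space.
    d, U = np.linalg.eigh(Ca + Cb)
    P = (U / np.sqrt(d)).T
    _, B = np.linalg.eigh(P @ Ca @ P.T)       # eigenvalues in ascending order
    W_full = B.T @ P
    # Keep the filters at both ends of the eigenvalue spectrum.
    idx = np.r_[0:n_pairs, W_full.shape[0] - n_pairs:W_full.shape[0]]
    return W_full[idx]

def log_var_features(W, trials):
    """Log of normalized variance of CSP-filtered signals (standard CSP features)."""
    feats = []
    for x in trials:
        v = np.var(W @ x, axis=1)
        feats.append(np.log(v / v.sum()))
    return np.array(feats)

# Synthetic trials: each class has elevated variance on a different channel.
n_trials, n_ch, n_t = 40, 8, 200
def make_trials(boost_ch):
    scale = np.ones(n_ch)
    scale[boost_ch] = 3.0
    return [scale[:, None] * rng.standard_normal((n_ch, n_t)) for _ in range(n_trials)]

left, right = make_trials(0), make_trials(n_ch - 1)

W = csp_filters(left[:30], right[:30])
Xtr = np.vstack([log_var_features(W, left[:30]), log_var_features(W, right[:30])])
ytr = np.r_[np.zeros(30), np.ones(30)]

# Fisher LDA: project onto Sw^-1 (mu1 - mu0), threshold at the midpoint.
mu0, mu1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
Sw = np.cov(Xtr[ytr == 0].T) + np.cov(Xtr[ytr == 1].T)
w = np.linalg.solve(Sw, mu1 - mu0)
b = -w @ (mu0 + mu1) / 2

Xte = np.vstack([log_var_features(W, left[30:]), log_var_features(W, right[30:])])
yte = np.r_[np.zeros(10), np.ones(10)]
acc = np.mean((Xte @ w + b > 0) == yte)
print(round(acc, 2))
```

Extending this to the multi-band, multi-class FBCSP with MI selection used in the study amounts to repeating the CSP step per frequency band and class pair and ranking the pooled features before classification.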

