Objective. Daily life tasks can pose a significant challenge for motor-impaired persons. Depending on the severity of the impairment, increasingly complex solutions are required for them to retain an independent life. Brain-computer interfaces (BCIs) aim to provide an intuitive form of control for advanced assistive devices such as robotic arms or neuroprostheses. In the current study, we aim to decode three different executed hand movements from electroencephalographic (EEG) data in an online BCI scenario. Approach. Immersed in a desktop-based simulation environment, 15 non-disabled participants interacted with virtual objects from daily life via an avatar’s robotic arm. In a short calibration phase, participants executed palmar grasps, lateral grasps, and wrist supinations. Using these data, we trained a classification model on features extracted from the low-frequency time domain. In the subsequent evaluation phase, participants controlled the avatar’s robotic arm and interacted with the virtual objects whenever a movement was classified correctly. Main results. On average, participants scored 48% of all online movement trials correctly (3-condition scenario, adjusted chance level 40%, alpha = 0.05). The movement-related cortical potentials (MRCPs) underlying the calibration data show significant differences between conditions over contralateral central sensorimotor areas, and these differences are retained in the data acquired during online BCI use. Significance. We demonstrated successful online decoding of two grasps and one wrist supination movement using low-frequency time-domain features of the human EEG. These findings may contribute to the development of a more natural and intuitive BCI-based control modality for upper-limb motor neuroprostheses or robotic arms for people with motor impairments.
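For readers unfamiliar with the "adjusted chance level" figure quoted above, it is commonly obtained as the upper bound of a binomial confidence interval around the theoretical chance rate of random guessing. The sketch below illustrates that computation under this assumption; the trial count is a hypothetical placeholder, not a number taken from the study, so the resulting threshold will only approximate the 40% reported here.

```python
# Sketch: significance threshold ("adjusted chance level") for a 3-class
# online BCI, assuming it is the smallest accuracy a random classifier
# would exceed with probability < alpha under a binomial model.
from scipy.stats import binom

n_trials = 180        # hypothetical number of evaluation trials (not from the study)
p_chance = 1.0 / 3.0  # theoretical chance rate for 3 balanced classes
alpha = 0.05

# Smallest number of correct trials that random guessing reaches with
# probability < alpha; accuracies at or above this bound are considered
# significantly better than chance.
k_crit = binom.ppf(1.0 - alpha, n_trials, p_chance) + 1
adjusted_chance_level = k_crit / n_trials
print(f"adjusted chance level ~ {adjusted_chance_level:.1%}")
```

With fewer trials the bound moves further above 1/3, which is why online BCI studies report an adjusted chance level rather than the nominal 33% for three conditions.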