Abstract

Objective. We propose a brain–computer interface (BCI) based visual-haptic neurofeedback training (NFT) paradigm that incorporates a synchronous visual scene and proprioceptive electrical stimulation feedback. The goal of this work was to improve sensorimotor cortical activation and classification performance during motor imagery (MI); the correlations between them and the associated brain network patterns were also investigated. Approach. 64-channel electroencephalographic (EEG) data were recorded from nineteen healthy subjects during MI before and after NFT. During the NFT sessions, the synchronous visual-haptic feedback was driven by real-time lateralized relative event-related desynchronization (lrERD). Main results. Comparing the pre- and post-NFT control sessions, the cortical activations measured by multi-band (alpha_1: 8–10 Hz, alpha_2: 11–13 Hz, beta_1: 15–20 Hz and beta_2: 22–28 Hz) absolute ERD powers and lrERD patterns were significantly enhanced after the NFT. The classification performance also improved significantly, rising by ~9% from a relatively poor baseline to a mean classification accuracy of ~85%. In addition, the lrERD patterns were significantly correlated with the classification accuracies. The partial directed coherence based functional connectivity (FC) networks covering the sensorimotor area were also strengthened after the NFT. Significance. These findings validate the feasibility of the proposed NFT for improving sensorimotor cortical activation and BCI performance during motor imagery, and suggest it is a promising way to optimize conventional NFT protocols and to evaluate the effectiveness of motor training.
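
To illustrate the kind of feedback signal described above, the minimal sketch below computes a per-trial relative ERD in a chosen frequency band and a lateralized index contrasting left and right sensorimotor channels (e.g. C3 vs. C4). The exact lrERD definition, channel selection, band limits, and window lengths used in the study are not given in the abstract, so the formula (baseline-normalized band-power decrease, lateralized as a contralateral-minus-ipsilateral difference) and the function names here are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a lateralized relative ERD (lrERD) estimate for one MI trial.
# Assumptions (not from the paper): relative ERD = (P_task - P_baseline) / P_baseline,
# lrERD = ERD(contralateral) - ERD(ipsilateral), channels C3/C4, band 8-13 Hz.
import numpy as np
from scipy.signal import butter, filtfilt

def band_power(x, fs, band):
    """Mean power of a 1-D EEG signal x within the given frequency band (Hz)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, x)
    return np.mean(filtered ** 2)

def relative_erd(trial, fs, band, baseline_win, task_win):
    """Baseline-normalized ERD; negative values indicate desynchronization."""
    base = trial[int(baseline_win[0] * fs):int(baseline_win[1] * fs)]
    task = trial[int(task_win[0] * fs):int(task_win[1] * fs)]
    p_base = band_power(base, fs, band)
    p_task = band_power(task, fs, band)
    return (p_task - p_base) / p_base

def lrerd(trial_c3, trial_c4, fs, band=(8.0, 13.0),
          baseline_win=(0.0, 2.0), task_win=(3.0, 6.0), imagined_hand="right"):
    """Lateralized relative ERD: contralateral minus ipsilateral ERD."""
    erd_c3 = relative_erd(trial_c3, fs, band, baseline_win, task_win)
    erd_c4 = relative_erd(trial_c4, fs, band, baseline_win, task_win)
    # Right-hand MI: C3 is contralateral; left-hand MI: C4 is contralateral.
    return erd_c3 - erd_c4 if imagined_hand == "right" else erd_c4 - erd_c3

if __name__ == "__main__":
    # Example with synthetic data: a 7 s trial sampled at 250 Hz per channel.
    fs = 250
    rng = np.random.default_rng(0)
    c3 = rng.standard_normal(7 * fs)
    c4 = rng.standard_normal(7 * fs)
    print(f"lrERD (right-hand MI): {lrerd(c3, c4, fs):.3f}")
```

In an online NFT setting such an index would be recomputed over short sliding windows and mapped to the intensity of the visual and electrical-stimulation feedback; the window and mapping choices here are placeholders.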
