Abstract

Motor brain-computer interfaces (BCIs) aim to restore or compensate for lost central nervous system function. Within motor BCIs, motor execution (ME), which relies on patients' residual or intact movement function, is a more intuitive and natural paradigm. Based on the ME paradigm, voluntary hand movement intentions can be decoded from electroencephalography (EEG) signals. Numerous studies have investigated EEG-based unimanual movement decoding, and some have explored bimanual movement decoding, since bimanual coordination is important for daily-life assistance and bilateral neurorehabilitation therapy. However, multi-class classification of unimanual and bimanual movements still performs poorly. To address this problem, we propose a deep learning model driven by neurophysiological signatures that, for the first time, exploits both movement-related cortical potentials (MRCPs) and event-related synchronization/desynchronization (ERS/D) oscillations, inspired by the finding that brain signals during ME encode motor-related information in both evoked potentials and oscillatory components. The proposed model consists of a feature representation module, an attention-based channel-weighting module, and a shallow convolutional neural network module. Results show that our model outperforms the baseline methods, achieving a six-class classification accuracy of 80.3% on unimanual and bimanual movements, and that each feature module contributes to this performance. This work is the first to fuse MRCPs and ERS/D oscillations of ME in a deep learning framework to enhance multi-class decoding of unimanual and bimanual movements, and it can facilitate neural decoding for neurorehabilitation and assistance.
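
As a rough illustration of the three-module design named in the abstract (feature representation, attention-based channel weighting, shallow CNN), the PyTorch sketch below stacks two input planes per EEG channel, one for the low-frequency MRCP trace and one for ERS/D band power, weights channels with a learned attention vector, and classifies with a shallow CNN. All layer sizes, the attention formulation, and the names `ChannelAttention` and `DualFeatureNet` are assumptions for illustration only; the abstract does not specify the paper's actual architecture.

```python
# Hypothetical sketch of the described pipeline; every design detail below
# is an assumption, not taken from the paper.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Learns one weight per EEG channel from channel-wise mean activation."""

    def __init__(self, n_channels: int):
        super().__init__()
        self.fc = nn.Sequential(nn.Linear(n_channels, n_channels), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, feature_planes, channels, time)
        w = self.fc(x.mean(dim=(1, 3)))   # (batch, channels)
        return x * w[:, None, :, None]    # broadcast weights over planes/time


class DualFeatureNet(nn.Module):
    """Two-branch input (MRCP plane + ERS/D band-power plane) followed by
    channel attention and a shallow CNN classifier."""

    def __init__(self, n_channels: int = 64, n_classes: int = 6):
        super().__init__()
        self.attention = ChannelAttention(n_channels)
        self.cnn = nn.Sequential(                  # shallow CNN module
            nn.Conv2d(2, 16, kernel_size=(1, 25), padding=(0, 12)),  # temporal
            nn.Conv2d(16, 16, kernel_size=(n_channels, 1)),          # spatial
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Flatten(),
            nn.LazyLinear(n_classes),              # six movement classes
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 2, channels, time); plane 0 = MRCP, plane 1 = ERS/D power
        return self.cnn(self.attention(x))


# Smoke test with random data shaped like 1-s epochs at 250 Hz.
model = DualFeatureNet(n_channels=64, n_classes=6)
logits = model(torch.randn(8, 2, 64, 250))
print(logits.shape)  # torch.Size([8, 6])
```

Precomputing the MRCP and ERS/D planes before the network mirrors the feature representation module's role of injecting neurophysiological priors, while the temporal-then-spatial convolution pair is a common shallow-CNN pattern for EEG; both choices here are illustrative, not the authors' method.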
