Abstract

Objective. The Motor Imagery Brain-Computer Interface (MI-BCI) is an active BCI paradigm focused on identifying motor intention, and it is one of the most important non-invasive BCI paradigms. In MI-BCI studies, deep learning-based methods (especially lightweight networks) have attracted increasing attention in recent years, but decoding performance still needs further improvement. Approach. To address this problem, we designed a filter bank structure with sinc-convolutional layers for spatio-temporal feature extraction from motor imagery electroencephalography (EEG) in four motor rhythms. A Channel Self-Attention mechanism was introduced for feature selection based on both global and local information, yielding a model called the Filter Bank Sinc-convolutional Network with Channel Self-Attention for high-performance MI decoding. We also proposed a data augmentation method based on multivariate empirical mode decomposition to improve the generalization capability of the model. Main results. We performed an intra-subject evaluation experiment on unseen data from three open MI datasets. The proposed method achieved mean accuracies of 78.20% (4-class scenario) on BCI Competition IV IIa, 87.34% (2-class scenario) on BCI Competition IV IIb, and 72.03% (2-class scenario) on the Open Brain Machine Interface (OpenBMI) dataset, which are significantly higher than those of the compared deep learning-based methods by at least 3.05% (p = 0.0469), 3.18% (p = 0.0371), and 2.27% (p = 0.0024), respectively. Significance. This work provides a new option for deep learning-based MI decoding and can be employed to build BCI systems for motor rehabilitation.
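The core of a sinc-convolutional filter bank is that each temporal filter is a band-pass FIR kernel parameterized only by its cut-off frequencies (in a trained SincNet-style layer these cut-offs are learnable). The following is a minimal NumPy sketch of that idea; the specific band edges, sampling rate, kernel length, and channel count are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def sinc_bandpass_kernel(f_low, f_high, fs, length=251):
    # Windowed-sinc band-pass FIR kernel, built as the difference of
    # two low-pass sinc filters and tapered with a Hamming window.
    n = np.arange(length) - (length - 1) / 2
    def lowpass(fc):
        return 2 * fc / fs * np.sinc(2 * fc / fs * n)
    return (lowpass(f_high) - lowpass(f_low)) * np.hamming(length)

fs = 250                                       # sampling rate in Hz (assumed)
bands = [(4, 8), (8, 13), (13, 32), (32, 40)]  # hypothetical motor-rhythm bands
filter_bank = np.stack([sinc_bandpass_kernel(lo, hi, fs) for lo, hi in bands])

# Apply the bank to a multichannel EEG segment (22 channels x 1000 samples,
# random data standing in for a real MI trial).
eeg = np.random.randn(22, 1000)
filtered = np.stack(
    [[np.convolve(ch, k, mode="same") for ch in eeg] for k in filter_bank]
)
# filtered has shape (4 bands, 22 channels, 1000 samples)
```

In a learnable layer, `f_low` and `f_high` would be trainable tensors and the convolution would run inside the network, so each band adapts to the data rather than staying fixed as above.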
