Abstract

A robust decoding model that can efficiently handle subject and period variation is urgently needed for practical brain-computer interface (BCI) systems. The performance of most electroencephalogram (EEG) decoding models depends on the characteristics of specific subjects and recording periods, so the models require calibration and training on annotated data before use. This requirement becomes impractical when subjects cannot collect data over an extended period, especially during motor imagery (MI)-based rehabilitation of motor disabilities. To address this issue, we propose an unsupervised domain adaptation framework, iterative self-training multisubject domain adaptation (ISMDA), that focuses on the offline MI task. First, a purposefully designed feature extractor maps the EEG into a latent space of discriminative representations. Second, an attention module based on dynamic transfer matches source-domain and target-domain samples with a high degree of overlap in the latent space. Then, in the first stage of the iterative training process, an independent classifier oriented to the target domain clusters target-domain samples by similarity. Finally, in the second stage of the iterative training process, a pseudolabel algorithm based on certainty and confidence calibrates the error between the predicted and empirical probabilities. To evaluate the effectiveness of the model, extensive experiments were performed on three publicly available MI datasets: BCI IV IIa, the High Gamma dataset, and the Kwon et al. dataset. The proposed method achieved cross-subject classification accuracies of 69.51%, 82.38%, and 90.98% on the three datasets, outperforming current state-of-the-art offline algorithms. These results demonstrate that the proposed method addresses the main challenges of the offline MI paradigm.
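The abstract does not give implementation details for the pseudolabeling stage. As an illustration only, the following is a minimal sketch of confidence-thresholded pseudolabel selection, a common ingredient of iterative self-training; the NumPy implementation, function names, and threshold value are assumptions, not taken from the paper:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def select_pseudolabels(logits, conf_threshold=0.9):
    """Assign pseudolabels only to target-domain samples the classifier
    is confident about; the rest stay unlabeled for this round.

    logits: (n_samples, n_classes) classifier outputs on target data.
    Returns (labels, mask): predicted classes and a boolean mask of
    the samples whose top-class probability meets the threshold.
    """
    probs = softmax(logits)
    labels = probs.argmax(axis=-1)
    confidence = probs.max(axis=-1)
    mask = confidence >= conf_threshold
    return labels, mask
```

In a self-training loop, the masked-in samples and their pseudolabels would be added to the training set for the next iteration, while low-confidence samples are revisited after the classifier improves.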

