Abstract

Motor imagery EEG classification plays a crucial role in non-invasive Brain-Computer Interface (BCI) research. However, classification performance is affected by the non-stationarity and individual variability of EEG signals. Simply pooling EEG data with different statistical distributions to train a classification model can severely degrade generalization performance. To address this issue, existing methods primarily focus on domain adaptation, which requires access to the test data during training and is therefore unrealistic and impractical in many EEG application scenarios. In this paper, we propose a novel multi-source domain generalization framework called EEG-DG, which leverages multiple source domains with different statistical distributions to build models that generalize to unseen target EEG data. We optimize both the marginal and conditional distributions to ensure the stability of the joint distribution across source domains, and extend this to a multi-source domain generalization framework that achieves domain-invariant feature representations, thereby alleviating calibration efforts. Systematic experiments on a simulated dataset, the BCI Competition IV 2a and 2b datasets, and the OpenBMI dataset demonstrate the superior and competitive performance of our proposed framework over other state-of-the-art methods. Specifically, EEG-DG achieves average classification accuracies of 81.79% and 87.12% on datasets IV-2a and IV-2b, respectively, and 78.37% and 76.94% for inter-session and inter-subject evaluations on the OpenBMI dataset, even outperforming some domain adaptation methods. Our code is available at https://github.com/zxchit2022/EEG-DG for evaluation.
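To illustrate the idea of aligning both marginal and conditional distributions across source domains, here is a minimal NumPy sketch, not the paper's actual implementation: it scores domain discrepancy with a biased squared Maximum Mean Discrepancy (MMD) estimate, summing a marginal term over each pair of source domains and a class-conditional term per shared label. The function names (`mmd2`, `joint_alignment_loss`) and the weighting parameter `lam` are illustrative assumptions; in practice such a loss would be minimized jointly with a classification loss on learned features.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=1.0):
    # Biased estimate of squared Maximum Mean Discrepancy between
    # samples X and Y; zero when the two distributions match.
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean())

def joint_alignment_loss(feats, labels, gamma=1.0, lam=1.0):
    """Marginal + class-conditional MMD summed over source-domain pairs.

    feats:  list of (n_i, d) feature arrays, one per source domain.
    labels: list of (n_i,) label arrays aligned with feats.
    lam:    assumed trade-off weight between the two terms.
    """
    loss = 0.0
    for i in range(len(feats)):
        for j in range(i + 1, len(feats)):
            # Marginal alignment: P(X_i) vs P(X_j).
            loss += mmd2(feats[i], feats[j], gamma)
            # Conditional alignment: P(X_i | y) vs P(X_j | y) per shared class.
            for c in np.intersect1d(labels[i], labels[j]):
                loss += lam * mmd2(feats[i][labels[i] == c],
                                   feats[j][labels[j] == c], gamma)
    return loss
```

As a quick check of the intuition, two source domains drawn from the same distribution should yield a near-zero loss, while a mean-shifted domain should yield a clearly larger one.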
