Abstract

Non-stationarity of EEG signals leads to high variability between subjects, making it challenging to directly use data from other subjects (the source domain) to train a classifier for the current subject (the target domain). In this study, we propose MI-DAGSC to address domain adaptation challenges in EEG-based motor imagery (MI) decoding. By combining domain-level information, class-level information, and inter-sample structure information, our model effectively aligns the feature distributions of the source and target domains. This work is an extension of our previous domain adaptation work MI-DABAN (Li et al., 2023). Building on MI-DABAN, MI-DAGSC introduces Sample-Feature Blocks (SFBs) and Graph Convolution Blocks (GCBs) to focus on intra-sample and inter-sample information, respectively. The synergistic integration of SFBs and GCBs enables the model to capture comprehensive information and understand the relationships between samples, thus improving representation learning. Furthermore, we introduce a triplet loss to enhance the alignment and compactness of feature representations. Extensive experiments on real EEG datasets demonstrate the effectiveness of MI-DAGSC, confirming that our method makes a valuable contribution to MI-EEG decoding. Moreover, it holds great potential for various applications in brain–computer interface systems and neuroscience research. The code of the proposed architecture is available at https://github.com/zhangdx21/MI-DAGSC.
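To make the two ideas named above more concrete, the sketch below (not the authors' implementation; see the linked repository for that) shows, under illustrative assumptions, how a graph convolution over an inter-sample similarity graph and a triplet loss on the resulting features might be combined. The cosine-similarity adjacency, layer sizes, batch layout, and margin value are all assumptions for demonstration only.

```python
# Minimal sketch (assumptions, not the authors' code): a graph convolution that
# mixes features along an inter-sample similarity graph, followed by a triplet
# loss that pulls same-class features together and pushes other classes apart.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGraphConvBlock(nn.Module):
    """One graph-convolution step over a sample-similarity graph (illustrative)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Build a row-normalized adjacency from pairwise cosine similarity (assumption).
        sim = F.cosine_similarity(x.unsqueeze(1), x.unsqueeze(0), dim=-1)
        adj = F.softmax(sim, dim=-1)
        # Propagate features over the graph, then apply a shared linear map.
        return F.relu(self.linear(adj @ x))


if __name__ == "__main__":
    batch, feat_dim = 16, 64
    features = torch.randn(batch, feat_dim)       # hypothetical per-sample features
    gcb = SimpleGraphConvBlock(feat_dim, feat_dim)
    refined = gcb(features)

    # Triplet loss: anchor and positive share a class label, negative does not
    # (the index slicing here is purely illustrative).
    anchor, positive, negative = refined[0:4], refined[4:8], refined[8:12]
    triplet = nn.TripletMarginLoss(margin=1.0)
    loss = triplet(anchor, positive, negative)
    print(loss.item())
```

In practice, the anchor/positive/negative triplets would be mined from labeled source samples and pseudo-labeled target samples so that the loss tightens class clusters across domains; the random slicing above only stands in for that mining step.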
