Abstract

Electroencephalography (EEG) is the most prevalent signal acquisition technique for brain-computer interfaces (BCIs). However, the statistical distribution of EEG data varies across subjects and sessions, so a classifier trained on one domain generalizes poorly to another. Although collecting a large number of recordings may alleviate this issue, doing so is often impractical and user-unfriendly. This study proposes integrating deep domain adaptation with few-shot learning to address the challenge, leveraging knowledge from multiple source subjects to enhance the performance for a single target subject. The framework comprises three modules: a feature extractor, a domain discriminator, and a classifier. The feature extractor uses the available labeled samples with a supervised contrastive loss to map discriminative features onto a deep representation space, where features from the same class are more similar than those from different classes. The domain discriminator reduces domain shift through adversarial training, and the classifier predicts the user's motor intention from the EEG features. The framework was extensively evaluated on BCI Competition IV Datasets 2a and 2b. The results indicate that the framework enhances BCI performance and can reduce calibration effort compared with the traditional approach, although a major limitation is that it requires meticulous selection of source subjects.
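The supervised contrastive objective mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration of the standard supervised contrastive loss (Khosla et al., 2020), not the authors' implementation; the function name, temperature value, and toy data are assumptions for illustration only.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over a batch of embeddings.

    features: (N, D) array of embeddings; labels: (N,) integer class labels.
    Embeddings of the same class are pulled together on the unit hypersphere,
    while embeddings of different classes are pushed apart.
    """
    # L2-normalize so similarities are cosine similarities
    z = features / np.linalg.norm(features, axis=1, keepdims=True)
    n = z.shape[0]
    sim = (z @ z.T) / temperature                 # pairwise similarity logits
    self_mask = np.eye(n, dtype=bool)
    # positives: samples with the same label, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    # denominator excludes the anchor: mask it out with a large negative logit
    sim_masked = np.where(self_mask, -1e9, sim)
    log_prob = sim_masked - np.log(np.exp(sim_masked).sum(axis=1, keepdims=True))
    # average log-probability over positives, for anchors that have positives
    valid = pos_mask.sum(axis=1) > 0
    mean_log_prob_pos = (np.where(pos_mask, log_prob, 0.0).sum(axis=1)[valid]
                         / pos_mask.sum(axis=1)[valid])
    return -mean_log_prob_pos.mean()
```

On a toy batch where same-class embeddings coincide and classes are orthogonal, the loss is near zero; shuffling the labels so positives point at orthogonal samples makes it much larger, matching the intent of making same-class features more similar than cross-class ones.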
