Abstract

Advances in deep learning have shown great promise for high-accuracy electroencephalography (EEG) signal classification across a variety of tasks. However, EEG datasets are often plagued by high inter-subject signal variability. Robust deep learning models are notoriously difficult to train under such conditions, often yielding subpar or widely varying performance across subjects under the leave-one-subject-out paradigm. Recently, the model-agnostic meta-learning (MAML) framework was introduced as a way to improve a model's ability to generalize to new tasks. While the original framework focused on task-based meta-learning, this research shows that the meta-learning methodology can be adapted to subject-based signal classification, while keeping the same task objectives, and can achieve state-of-the-art performance. Specifically, we propose a novel few/zero-shot subject-independent meta-learning framework for multi-class inner speech and binary-class motor imagery classification. In contrast to current subject-adaptive methods, which require a large number of labeled samples from the target subject, the proposed framework is effective for training zero-calibration and few-shot models for subject-independent EEG classification. The proposed few/zero-shot subject-independent meta-learning mechanism performs well on both small and large datasets and achieves robust, generalized performance across subjects. The results show a significant improvement over the current state-of-the-art, with binary-class motor imagery classification reaching 88.70% accuracy and multi-class inner speech classification averaging 31.15%. Code will be made available to the public upon publication.
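To make the core idea concrete, the sketch below illustrates how subjects can take the role that tasks play in standard MAML: each episode draws support and query data from a single held-out subject, the inner loop adapts to that subject's support set, and the outer loop updates the meta-initialization on the query set. This is a minimal illustrative sketch assuming PyTorch; the model, episode format, and all names (`EEGClassifier`, `maml_outer_step`, layer sizes) are hypothetical stand-ins, not the authors' released implementation.

```python
# Minimal sketch of subject-based MAML for EEG classification (PyTorch).
# All names and shapes are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EEGClassifier(nn.Module):
    """Toy stand-in for an EEG encoder + classifier head."""

    def __init__(self, n_channels=8, n_samples=128, n_classes=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                                  # index 0
            nn.Linear(n_channels * n_samples, 64),         # index 1
            nn.ReLU(),                                     # index 2
            nn.Linear(64, n_classes),                      # index 3
        )

    def forward(self, x, params=None):
        if params is None:
            return self.net(x)
        # Functional forward pass with adapted parameters (inner loop).
        h = x.flatten(1)
        h = F.relu(F.linear(h, params["net.1.weight"], params["net.1.bias"]))
        return F.linear(h, params["net.3.weight"], params["net.3.bias"])


def maml_outer_step(model, meta_opt, subject_episodes, inner_lr=0.01):
    """One meta-update. Each episode holds (support, query) batches from
    ONE subject; subjects replace tasks in the standard MAML recipe."""
    meta_opt.zero_grad()
    meta_loss = 0.0
    for (xs, ys), (xq, yq) in subject_episodes:
        # Inner loop: one gradient step of adaptation on the subject's
        # support set (create_graph=True keeps second-order gradients).
        params = dict(model.named_parameters())
        support_loss = F.cross_entropy(model(xs, params), ys)
        grads = torch.autograd.grad(
            support_loss, list(params.values()), create_graph=True
        )
        adapted = {
            k: p - inner_lr * g
            for (k, p), g in zip(params.items(), grads)
        }
        # Outer loss: evaluate the adapted parameters on the query set.
        meta_loss = meta_loss + F.cross_entropy(model(xq, adapted), yq)
    meta_loss = meta_loss / len(subject_episodes)
    meta_loss.backward()
    meta_opt.step()
    return meta_loss.item()


# Example usage with random data: 2 subjects, a 4-class (inner-speech-like)
# setup with 16 support and 16 query trials per subject.
model = EEGClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
episodes = [
    ((torch.randn(16, 8, 128), torch.randint(0, 4, (16,))),
     (torch.randn(16, 8, 128), torch.randint(0, 4, (16,))))
    for _ in range(2)
]
print(maml_outer_step(model, opt, episodes))
```

At test time, the meta-learned initialization can be used as-is on an unseen subject (the zero-calibration case) or adapted with a handful of labeled trials via the same inner-loop step (the few-shot case).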
