Abstract

In this study, a few-shot transfer learning approach was introduced to decode movement intention from electroencephalographic (EEG) signals, enabling the recognition of new tasks with minimal adaptation. To this end, a dataset of EEG signals recorded during the preparation of complex sub-movements was built from a publicly available data collection. The dataset was divided into two parts: the source-domain dataset (including 5 classes) and the support (target-domain) dataset (including 2 classes), with no class overlap between the two. The proposed methodology consists of projecting EEG signals into the space-frequency-time domain, processing these projections (rearranged into channels × frequency frames) with a custom EEG-based deep neural network (denoted EEGframeNET5), and then adapting the system to new tasks through few-shot transfer learning. The proposed method achieved an average accuracy of 72.45±4.19% in the 5-way classification of samples from the source-domain dataset, outperforming comparable studies in the literature. In the second phase of the study, the few-shot transfer learning approach was used to adapt the neural system so that it could recognize the new tasks in the support dataset. The results demonstrated the system's ability to adapt to new tasks, reaching an average accuracy of 80±0.12% in discriminating hand opening/closing preparation and outperforming results reported in the literature. This study suggests the effectiveness of EEG in capturing information related to the motor preparation of complex movements, potentially paving the way for brain-computer interface (BCI) systems based on motion-planning decoding. The proposed methodology could be straightforwardly extended to advanced EEG signal processing in other scenarios, such as motor imagery or neural disorder classification.
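To make the described pipeline concrete, the following is a minimal sketch in Python/PyTorch, not the authors' implementation: it assumes a short-time Fourier transform for the space-frequency-time projection, a small illustrative CNN standing in for EEGframeNET5, and head-only fine-tuning as the few-shot adaptation step. All names (eeg_to_frames, EEGFrameNet, few_shot_adapt), layer sizes, channel counts, and hyperparameters are assumptions for illustration only.

# Sketch of the described pipeline (illustrative, not the published EEGframeNET5):
# EEG epochs are projected into the time-frequency domain per channel via STFT,
# stacked as channels x frequency x time frames, classified by a small CNN, and
# then adapted to new classes by freezing the feature extractor and re-fitting
# only the classification head on a few labeled support samples.
import torch
import torch.nn as nn

def eeg_to_frames(x, n_fft=64, hop=16):
    """x: (batch, channels, time) -> (batch, channels, freq, frames) magnitude spectrograms."""
    b, c, t = x.shape
    spec = torch.stft(x.reshape(b * c, t), n_fft=n_fft, hop_length=hop,
                      window=torch.hann_window(n_fft), return_complex=True).abs()
    return spec.reshape(b, c, spec.shape[-2], spec.shape[-1])

class EEGFrameNet(nn.Module):
    """Small CNN over channels x frequency x time frames (architecture is illustrative only)."""
    def __init__(self, n_channels, n_classes):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_channels, 16, kernel_size=3, padding=1), nn.BatchNorm2d(16), nn.ELU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.BatchNorm2d(32), nn.ELU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.head(self.features(x))

def few_shot_adapt(model, support_x, support_y, n_new_classes, steps=50, lr=1e-3):
    """Freeze the pretrained feature extractor and fit a new head on the support set."""
    for p in model.features.parameters():
        p.requires_grad = False
    model.features.eval()
    model.head = nn.Linear(model.head.in_features, n_new_classes)
    opt = torch.optim.Adam(model.head.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(support_x), support_y)
        loss.backward()
        opt.step()
    return model

# Example: 5-way source-domain classification, then adaptation to 2 target classes.
x_source = torch.randn(8, 22, 512)               # 8 epochs, 22 EEG channels, 512 time samples (assumed sizes)
model = EEGFrameNet(n_channels=22, n_classes=5)
logits = model(eeg_to_frames(x_source))          # 5-way source-domain logits
x_support = torch.randn(4, 22, 512)              # few labeled target-domain epochs
y_support = torch.tensor([0, 1, 0, 1])           # e.g., hand opening vs. closing preparation
model = few_shot_adapt(model, eeg_to_frames(x_support), y_support, n_new_classes=2)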
