Abstract

A better understanding of electroencephalography (EEG) and electrocorticography (ECoG) signals would bring us closer to comprehending brain function, opening new avenues for treating brain disorders and for developing novel Brain-Computer Interface (BCI) applications. Deep Neural Networks (DNNs) have recently been employed with remarkable success to decode EEG/ECoG signals for BCI. However, the choice of architectural/training parameter values in these DNN architectures has received little attention. Meanwhile, data-driven optimization methodologies that leverage major advances in machine learning, such as the Transformer model, have recently been proposed. An exhaustive search over all possible architectural/training parameter values of the state-of-the-art DNN model (our baseline model) that decodes the motor imagery EEG and finger flexion ECoG signals of the BCI IV 2a and 4 datasets, respectively, would be prohibitively time-consuming. This paper therefore proposes an offline, model-based optimization technique, built on the Transformer model, for discovering optimal architectural/training parameter values for that model. Our findings indicate that the values selected this way improve the baseline model's performance by up to 14.7% on the BCI IV 2a dataset and by up to 61.0% on the BCI IV 4 dataset.
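The general shape of offline model-based optimization described above can be sketched as follows. This is a hypothetical, minimal illustration, not the paper's implementation: a simple 1-nearest-neighbour regressor stands in for the Transformer surrogate, and the configuration encoding (learning-rate exponent, layer count) and accuracy values are invented for the example.

```python
# Hypothetical sketch of offline, model-based hyperparameter optimization:
# a surrogate model is fit to previously logged (configuration, score) pairs,
# then used to rank unseen candidate configurations without new training runs.
# The paper uses a Transformer as the surrogate; a 1-nearest-neighbour
# regressor stands in here so the sketch stays self-contained.

def nn_surrogate(history, config):
    """Predict the decoding accuracy of `config` from logged (config, score) pairs."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    # Return the observed score of the closest logged configuration.
    _, score = min(history, key=lambda pair: sq_dist(pair[0], config))
    return score

def propose_best(history, candidates):
    """Score unseen candidate configs with the surrogate and return the best one."""
    return max(candidates, key=lambda c: nn_surrogate(history, c))

# Toy log of past runs: (learning-rate exponent, number of layers) -> accuracy.
history = [((-3, 2), 0.61), ((-2, 3), 0.70), ((-4, 4), 0.55)]
candidates = [(-2, 2), (-3, 3), (-2, 4)]
best = propose_best(history, candidates)
print(best)  # the candidate the surrogate expects to score highest
```

In the actual method, the surrogate would be trained on recorded evaluations of the baseline DNN, and only the top-ranked configurations would then be evaluated for real, avoiding the exhaustive search the abstract rules out.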
