Abstract
Objective. Due to the difficulty of acquiring motor imagery electroencephalography (MI-EEG) data and ensuring its quality, insufficient training data often leads to overfitting and poor generalization in deep learning-based classification networks. We therefore propose a novel data augmentation method and deep learning classification model to further improve MI-EEG decoding performance.
Approach. The raw EEG signals are transformed into time-frequency maps by continuous wavelet transform and used as the model input. An improved Wasserstein generative adversarial network with gradient penalty (WGAN-GP) is proposed as a data augmentation method, effectively expanding the dataset used for model training. In addition, a concise and efficient deep learning model is designed to further improve decoding performance.
Main results. Validation with multiple data evaluation methods demonstrates that the proposed generative network produces more realistic data. On the BCI Competition IV 2a and 2b datasets and a self-collected dataset, classification accuracies are 83.4%, 89.1% and 73.3%, with Kappa values of 0.779, 0.782 and 0.644, respectively. These results indicate that the proposed model outperforms state-of-the-art methods.
Significance. The proposed method effectively augments MI-EEG data, mitigates overfitting in classification networks, and improves MI classification accuracy, with positive implications for MI tasks.
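For reference, the abstract's "improved WGAN-GP" builds on the standard gradient-penalty critic objective (Gulrajani et al.); the form below is the well-known baseline loss, not the paper's specific modification:

```latex
L_D = \mathbb{E}_{\tilde{x}\sim P_g}\big[D(\tilde{x})\big]
    - \mathbb{E}_{x\sim P_r}\big[D(x)\big]
    + \lambda\,\mathbb{E}_{\hat{x}\sim P_{\hat{x}}}
      \Big[\big(\lVert \nabla_{\hat{x}} D(\hat{x}) \rVert_2 - 1\big)^2\Big],
\qquad
\hat{x} = \epsilon x + (1-\epsilon)\tilde{x},\quad \epsilon \sim U[0,1].
```

Here $P_r$ and $P_g$ are the real and generated data distributions, $D$ is the critic, and the penalty term (weighted by $\lambda$, commonly 10) pushes the critic's gradient norm toward 1 on random interpolates $\hat{x}$, stabilizing training without weight clipping.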
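To make the input pipeline in the Approach concrete, the sketch below shows how a raw EEG trial can be turned into a time-frequency map with a continuous wavelet transform. This is a minimal illustration, not the paper's implementation: the Morlet mother wavelet, center frequency `w0`, scale range, 250 Hz sampling rate, and the synthetic 10 Hz "mu rhythm" test signal are all assumptions made for the example.

```python
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Naive continuous wavelet transform with a Morlet mother wavelet.

    Returns a (len(scales), len(signal)) magnitude map, i.e. the kind of
    time-frequency image that could be fed to a CNN classifier.
    Normalization constants (e.g. pi^{-1/4}) are omitted for brevity.
    """
    n = len(signal)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Sample the scaled wavelet over +-4 standard deviations of
        # its Gaussian envelope (support must stay shorter than the signal).
        t = np.arange(-4 * s, 4 * s + 1)
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        wavelet = wavelet / np.sqrt(s)
        # For the symmetric Morlet, convolution with the wavelet equals
        # correlation with its complex conjugate.
        out[i] = np.convolve(signal, wavelet, mode="same")
    return np.abs(out)

# Hypothetical example: 2 s of a noisy 10 Hz oscillation sampled at 250 Hz.
fs = 250
t = np.arange(0, 2, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
tf_map = morlet_cwt(eeg, scales=np.arange(2, 32))
print(tf_map.shape)  # (30, 500): 30 scales x 500 time samples
```

Each row of the resulting map corresponds to one wavelet scale (roughly, one frequency band), so stacking maps across EEG channels yields the image-like tensors that both the generative network and the classifier can consume.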