Abstract

EEG-based emotion recognition is an essential area of brain-computer interface (BCI) research. Because of the low signal-to-noise ratio (SNR) of EEG and the uncertain relationships between channels, it is difficult to mine the spatial and temporal information of EEG signals, especially through a single data representation. Several recent studies have applied knowledge distillation to emotion recognition. However, traditional knowledge distillation requires a more powerful teacher model, which is time-consuming to train and demands substantial storage space. To address these problems, this paper proposes a novel deep spatio-temporal mutual learning architecture for EEG emotion recognition, named MLBNet, which is composed of a temporal-biased feature learner and a spatial-biased feature learner. The two components learn from chain-like and matrix-like data representations respectively, and are trained collaboratively so that each mimics the predicted probabilities of the other. The proposed architecture improves the performance of EEG emotion recognition simply and effectively. To evaluate the validity of the proposed method, we performed subject-dependent binary-class and four-class emotion recognition tasks on the DEAP dataset, taking the average over 10-fold cross-validation as the final result. MLBNet achieves 98.72% accuracy on valence, 98.85% accuracy on arousal, and 98.32% accuracy on the four-class classification task. To the best of our knowledge, our model outperforms state-of-the-art models under identical settings.
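
The collaborative training described above, in which two peer learners mimic each other's predicted probabilities, can be illustrated with a minimal sketch. The following assumes a standard deep-mutual-learning objective (cross-entropy plus a KL term toward the peer's detached predictions); the learner modules, data shapes, and the `kl_weight` parameter are illustrative placeholders, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def mutual_learning_step(temporal_net, spatial_net, x_chain, x_matrix, labels,
                         opt_t, opt_s, kl_weight=1.0):
    """One collaborative update: each learner fits the labels and also
    matches the (detached) predicted probabilities of its peer."""
    logits_t = temporal_net(x_chain)    # chain-like (sequential) representation
    logits_s = spatial_net(x_matrix)    # matrix-like (spatial) representation

    # Supervised cross-entropy for both learners.
    ce_t = F.cross_entropy(logits_t, labels)
    ce_s = F.cross_entropy(logits_s, labels)

    # Mutual mimicry: KL divergence toward the peer, peer treated as a fixed target.
    kl_t = F.kl_div(F.log_softmax(logits_t, dim=1),
                    F.softmax(logits_s, dim=1).detach(), reduction="batchmean")
    kl_s = F.kl_div(F.log_softmax(logits_s, dim=1),
                    F.softmax(logits_t, dim=1).detach(), reduction="batchmean")

    loss_t = ce_t + kl_weight * kl_t
    loss_s = ce_s + kl_weight * kl_s

    # Update the two learners with their own optimizers.
    opt_t.zero_grad(); loss_t.backward(); opt_t.step()
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()
    return loss_t.item(), loss_s.item()
```

Because neither learner acts as a fixed teacher, this scheme avoids pre-training a larger teacher model, which is the storage and training-time drawback of conventional knowledge distillation noted in the abstract.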
