Abstract
Automatic emotion recognition based on electroencephalogram (EEG) has attracted rapidly increasing interest. Because of large inter-subject variability, subject-independent emotion recognition remains a great challenge. Recently, domain adaptation methods have been applied successfully in this field owing to their ability to align features across subjects. However, because EEG signals corresponding to some emotions have similar oscillation patterns, those emotions are often confused and aligned to the wrong categories, which limits a model's ability to generalize across subjects. Moreover, almost all existing methods support only offline application, which requires collecting a large number of samples from each new subject; online recognition therefore calls for a simpler model. In this paper, a novel Gated Recurrent Unit-Minimum Class Confusion (GRU-MCC) model is proposed. Specifically, a simple feature extractor based on a gated recurrent unit (GRU) is first applied to model the spatial dependence among multiple electrodes and to obtain high-level discriminative features. Then, during training, a minimum class confusion (MCC) loss is introduced to reduce the confusion between the correct and ambiguous classes for the target subject and to increase the transfer gains. We conduct both offline and online experiments on two public datasets, SEED and MPED, and the results indicate that our method achieves superior performance.
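The abstract does not give the form of the MCC objective, but the general minimum class confusion loss it builds on can be sketched as follows: a batch of target-subject logits is softened with a temperature, samples are re-weighted by prediction certainty, a class-confusion matrix is formed, and the off-diagonal (cross-class) mass is penalized. This is a minimal NumPy sketch under those assumptions, not the paper's implementation; the function name, temperature value, and weighting details are illustrative.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax along the given axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def mcc_loss(logits, temperature=2.0):
    """Sketch of a minimum class confusion loss on target-domain logits.

    logits: (batch, num_classes) array of unnormalized class scores
    for unlabeled target-subject samples.
    Returns a scalar: the average cross-class confusion mass.
    """
    b, c = logits.shape
    # Temperature-scaled class probabilities (soft predictions).
    probs = softmax(logits / temperature, axis=1)            # (B, C)
    # Entropy-based sample weights: more certain samples count more.
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    weights = 1.0 + np.exp(-entropy)
    weights = weights / weights.sum()                        # (B,)
    # Weighted pairwise class-confusion matrix.
    confusion = (probs * weights[:, None]).T @ probs         # (C, C)
    # Row-normalize so each row describes how one class is confused.
    confusion = confusion / confusion.sum(axis=1, keepdims=True)
    # Penalize the off-diagonal (cross-class) confusion mass.
    return (confusion.sum() - np.trace(confusion)) / c
```

Minimizing this quantity on unlabeled target samples discourages ambiguous predictions that spread probability mass across several emotion classes, which is how MCC reduces wrong-category alignment without needing target labels. For confident, well-separated predictions the off-diagonal mass (and hence the loss) approaches zero, while near-uniform predictions yield a large loss.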