Abstract

Emotion recognition based on electroencephalography (EEG) signals is a major area of affective computing. However, distributional differences between subjects have greatly hindered the large-scale application of EEG emotion recognition techniques. Most existing cross-subject methods merge multiple subjects into a single source domain, which leaves significant distributional differences within that source domain and hinders the model's ability to generalise to target subjects. In this paper, we propose a new method that combines a graph neural network-based prototype representation of multiple source domains with a clustering similarity loss. It consists of three parts: a multi-source domain prototype representation, a graph neural network, and a clustering-based similarity loss. The multi-source domain prototype representation treats each subject in the source domain as a sub-source domain and extracts prototype features from it, yielding a finer-grained feature representation. The graph neural network models the associations between prototypes and samples. In addition, we propose a similarity loss based on the clustering idea, which makes maximum use of the similarity between samples in the target domain while ensuring that classification performance does not degrade. We conduct extensive experiments on two benchmark datasets, SEED and SEED IV. The experimental results validate the effectiveness of the proposed multi-source domain fusion approach and indicate its superiority over existing methods in cross-subject classification tasks.
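The per-subject prototype extraction described above can be sketched as follows. This is a minimal illustration only, not the paper's implementation: it assumes features have already been extracted from the EEG signals, and it defines a prototype as the mean feature vector of each (subject, class) pair, with each subject treated as its own sub-source domain. All function and variable names are hypothetical.

```python
import numpy as np

def subject_class_prototypes(features, labels, subjects):
    """Compute one prototype (mean feature vector) per (subject, class) pair.

    Each source-domain subject is treated as a separate sub-source domain,
    so prototypes are not averaged across subjects. This is an illustrative
    simplification; the paper's actual prototype definition may differ.
    """
    prototypes = {}
    for s in np.unique(subjects):
        for c in np.unique(labels):
            mask = (subjects == s) & (labels == c)
            if mask.any():
                prototypes[(s, c)] = features[mask].mean(axis=0)
    return prototypes

# Toy example: 6 samples, 2 subjects, 2 emotion classes, 3-d features.
X = np.array([[1., 0., 0.], [3., 0., 0.],
              [0., 2., 0.], [0., 4., 0.],
              [5., 0., 0.], [0., 6., 0.]])
y = np.array([0, 0, 1, 1, 0, 1])
subj = np.array([0, 0, 0, 0, 1, 1])

protos = subject_class_prototypes(X, y, subj)
# Prototype for (subject 0, class 0) is the mean of the first two rows.
```

In a full pipeline, these per-subject prototypes would become nodes in the graph alongside target samples, letting the graph neural network model prototype–sample associations rather than collapsing all subjects into one source distribution.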
