Abstract

Recently, combinations of neural networks and attention mechanisms have been widely employed for electroencephalogram (EEG) emotion recognition (EER) and have achieved remarkable results. Nevertheless, most of these methods ignore the individual information within and across different frequency bands, applying a single-layer attention mechanism to the entire EEG signal and producing a relatively limited feature representation. To overcome this shortcoming, a spatial-frequency convolutional self-attention network (SFCSAN) is proposed in this paper to integrate feature learning from both the spatial and frequency domains of EEG signals. In this model, intra-frequency band self-attention is employed to learn frequency information from each frequency band, and an inter-frequency band mapping further combines the per-band outputs into a final attention representation that captures their complementary frequency information. Additionally, a parallel convolutional neural network (PCNN) layer is used to extract the spatial information of EEG signals. By incorporating spatial and frequency band information, SFCSAN can fully utilize the spatial- and frequency-domain information of EEG signals for emotion recognition. Experiments conducted on two public EEG emotion datasets achieved average accuracies of 95.15%/95.76%/95.64%/95.86% on the valence/arousal/dominance/liking labels of the DEAP dataset, and 93.77%/95.80%/96.26% on the valence/arousal/dominance labels of the DREAMER dataset, demonstrating that the proposed method is effective at capturing emotion-salient information and yields better recognition performance. The code for our work is available at https://github.com/qeebeast7/SFCSAN.
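To make the described architecture concrete, below is a minimal PyTorch sketch of the structure outlined in the abstract: per-band self-attention (intra-frequency), a linear mapping that fuses the band outputs (inter-frequency), and a parallel CNN branch for spatial features. All module names, tensor shapes, and hyperparameters here are illustrative assumptions rather than the authors' implementation; the actual code is in the linked repository.

```python
# Sketch of the SFCSAN structure described in the abstract.
# Shapes and hyperparameters are assumptions, not the paper's settings.
import torch
import torch.nn as nn


class SFCSANSketch(nn.Module):
    def __init__(self, n_bands=4, d_model=64, n_heads=4, n_classes=2):
        super().__init__()
        # Intra-frequency band self-attention: one attention block per band
        # learns dependencies within that band's feature sequence.
        self.intra_attn = nn.ModuleList([
            nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            for _ in range(n_bands)
        ])
        # Inter-frequency band mapping: fuses per-band attention outputs
        # into one representation of their complementary information.
        self.inter_map = nn.Linear(n_bands * d_model, d_model)
        # Parallel CNN branch over a 2-D electrode grid to capture
        # spatial information (kernel sizes are assumptions).
        self.pcnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(d_model + 32, n_classes)

    def forward(self, band_feats, spatial_map):
        # band_feats: (batch, n_bands, seq_len, d_model) per-band features
        # spatial_map: (batch, 1, H, W) electrode-grid representation
        band_outs = []
        for b, attn in enumerate(self.intra_attn):
            x = band_feats[:, b]               # (batch, seq_len, d_model)
            out, _ = attn(x, x, x)             # self-attention within band
            band_outs.append(out.mean(dim=1))  # pool over the sequence
        freq = self.inter_map(torch.cat(band_outs, dim=-1))
        spat = self.pcnn(spatial_map).flatten(1)  # spatial features
        return self.classifier(torch.cat([freq, spat], dim=-1))


# Usage with random data (shapes are assumptions):
model = SFCSANSketch()
bands = torch.randn(8, 4, 32, 64)  # 4 frequency bands
grid = torch.randn(8, 1, 9, 9)     # 9x9 electrode grid
print(model(bands, grid).shape)    # torch.Size([8, 2])
```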
