Abstract

Electroencephalogram (EEG)-based automatic emotion recognition technologies are gaining significant attention and have become crucial in the field of brain–computer interfaces (BCIs). In particular, deep learning methods have been widely used for emotion recognition in recent years. However, most existing methods focus on the spatiotemporal information of EEG signals, ignoring the potential relationships between brain activity signals and the differences in functional connectivity under different emotions. Here, we propose a functional connectivity-enhanced feature-grouped attention network (FC-FAN) for cross-subject emotion recognition. FC-FAN is a dual-input model: one input consists of differential entropy features derived from the original EEG signals, while the other consists of functional connectivity features obtained by computing the phase synchronization index. The primary EEG features of the two input groups are extracted by two dedicated residual blocks. Next, the designed time-series feature grouped attention module (TFGAM) and functional connectivity feature grouped attention module (F2GAM) highlight salient information and suppress irrelevant features in the two feature groups, respectively. Finally, the resulting representations interact through a fusion operator. The proposed framework not only sufficiently learns the spatiotemporal features of EEG signals but also explicitly models the nonlinear correlations between electrode signals. Comprehensive experiments confirm that FC-FAN performs well on subject-independent emotion recognition tasks.
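For concreteness, the two input feature types mentioned above can be sketched as follows. This is a minimal illustration and not the authors' implementation: the Gaussian assumption for differential entropy, the exact phase synchronization index formulation, the channel count, and the function names are assumptions, and the paper's band-wise filtering and windowing steps are omitted.

```python
import numpy as np
from scipy.signal import hilbert

def differential_entropy(eeg):
    """Differential entropy per channel under the common Gaussian assumption:
    h(X) = 0.5 * ln(2 * pi * e * var(X)).
    eeg: array of shape (channels, samples) for one band/window."""
    var = np.var(eeg, axis=-1)
    return 0.5 * np.log(2 * np.pi * np.e * var)

def phase_sync_index(eeg):
    """Pairwise phase synchronization index between channels,
    |mean_t exp(i * (phi_x(t) - phi_y(t)))|, with instantaneous phases
    taken from the analytic (Hilbert) signal.
    Returns a (channels, channels) connectivity matrix."""
    phases = np.angle(hilbert(eeg, axis=-1))
    diff = phases[:, None, :] - phases[None, :, :]   # all pairwise phase differences
    return np.abs(np.mean(np.exp(1j * diff), axis=-1))

# Hypothetical usage: a 62-channel EEG window (channel count assumed)
eeg = np.random.randn(62, 800)
de_features = differential_entropy(eeg)   # shape (62,)   -> first network input
fc_matrix = phase_sync_index(eeg)         # shape (62, 62) -> second network input
```

In the FC-FAN pipeline described above, these two feature groups would then be passed to the two residual branches and their respective attention modules (TFGAM and F2GAM) before fusion.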
