Background and objective: Electroencephalogram (EEG) emotion recognition is increasingly used to visualize and quantify brain networks, with potential applications in medical cognition and human–computer interaction. However, a single brain-network space captures only limited emotional information, while multi-channel recognition suffers from high computational cost and information redundancy. This paper therefore studies the affective brain–computer interface (aBCI) and critical-channel selection based on a fusion brain network.

Method: Two correlation algorithms are employed to build functional and causal brain networks, and a fusion brain network (FBN) is proposed to capture emotional information in the spatial domain. A deep recognition model (FBN-TCN) is then introduced to capture the dynamic changes of this spatial emotional information over time. A critical channel combination with high emotional correlation is selected from the full set of channels using the brain causal networks. All proposed models are evaluated on the SEED dataset in subject-dependent and subject-independent experiments.

Results: Experimental results demonstrate that the FBN covers more emotional information and thereby improves emotion recognition. The critical-channel aBCI achieves recognition performance comparable to the full-channel aBCI, with results of 95.51%/3.15% and 96.34%/1.62% in the Total band, respectively. The generalization and effectiveness of the model are further verified through ablation experiments, including subject-dependent and subject-independent emotion recognition on DEAP and model-block transformation experiments on SEED.

Conclusion and significance: The proposed approach simplifies the electrode-cap device and maximizes the utilization of spatial-domain emotional information in EEG. It is highly significant for enhancing the performance of aBCI in human–computer interaction.
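The abstract does not name the two correlation algorithms or the fusion rule, so the sketch below is only a rough illustration of the spatial-domain construction described in the Method section: a symmetric functional network from Pearson correlations, an asymmetric lagged-correlation matrix as a stand-in for the causal network, and a weighted-sum fusion. All function names, the `lag` parameter, and the fusion weight `alpha` are hypothetical and may differ from the paper's actual FBN.

```python
# Minimal sketch of a fusion brain network (FBN) adjacency matrix.
# Assumptions (not specified in the abstract): Pearson correlation for
# the functional network, a lag-1 cross-correlation proxy for the
# causal network, and a weighted sum as the fusion rule.
import numpy as np

def functional_network(eeg, eps=1e-12):
    """Symmetric functional connectivity: |Pearson r| between channels.

    eeg: array of shape (channels, samples).
    """
    x = eeg - eeg.mean(axis=1, keepdims=True)
    x /= (x.std(axis=1, keepdims=True) + eps)
    return np.abs(x @ x.T) / eeg.shape[1]

def causal_network(eeg, lag=1, eps=1e-12):
    """Asymmetric lagged-correlation matrix as a causal-network proxy:
    C[i, j] = |corr(x_i[t], x_j[t + lag])|."""
    past, future = eeg[:, :-lag], eeg[:, lag:]
    past = (past - past.mean(1, keepdims=True)) / (past.std(1, keepdims=True) + eps)
    future = (future - future.mean(1, keepdims=True)) / (future.std(1, keepdims=True) + eps)
    return np.abs(past @ future.T) / past.shape[1]

def fusion_brain_network(eeg, alpha=0.5):
    """Fuse the two adjacency matrices by a weighted sum (assumed rule)."""
    return alpha * functional_network(eeg) + (1 - alpha) * causal_network(eeg)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((62, 1000))  # 62 channels, as in SEED
    fbn = fusion_brain_network(eeg)
    print(fbn.shape)  # (62, 62) adjacency fed to the recognition model
```

In this reading, the fused adjacency matrix would serve as the spatial-domain input to the temporal model (FBN-TCN), and the asymmetric causal matrix would supply the per-channel scores used for critical-channel selection.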