Abstract

Graph neural networks (GNNs) have demonstrated efficient processing of graph-structured data, making them a promising approach for electroencephalogram (EEG) emotion recognition. However, due to dynamic functional connectivity and nonlinear relationships between brain regions, representing EEG signals as graph data remains a significant challenge. To address this problem, we propose a multi-domain-based graph representation learning (MD2GRL) framework to model EEG signals as graph data. Specifically, MD2GRL leverages gated recurrent units (GRU) and power spectral density (PSD) to construct the node features of two subgraphs. A self-attention mechanism is then adopted to learn the similarity matrix between nodes, which is fused with the intrinsic spatial matrix of EEG to compute the corresponding adjacency matrix. In addition, we introduce a learnable soft-thresholding operator that sparsifies the adjacency matrix to reduce noise in the graph structure. For the downstream task, we design a dual-branch GNN that incorporates spatial asymmetry for graph coarsening. We conducted experiments on the publicly available SEED and DEAP datasets, under both subject-dependent and subject-independent settings, to evaluate the performance of our model in emotion classification. Experimental results demonstrate that our method achieves state-of-the-art (SOTA) classification performance in both settings. Furthermore, visualization of the learned graph structure reveals EEG channel connections that are significantly related to emotion while suppressing irrelevant noise. These findings are consistent with established neuroscience research and demonstrate the potential of our approach for understanding the neural underpinnings of emotion.
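
To make the graph-construction step concrete, the PyTorch sketch below illustrates one plausible reading of the adjacency computation described above: a self-attention similarity matrix over channel-node features is fused with a fixed spatial matrix and then sparsified by a learnable soft threshold. The module name, the fusion rule (a learnable convex combination via alpha), and the scalar threshold tau are our own illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttentionAdjacency(nn.Module):
    """Hypothetical sketch of the adjacency construction in the abstract:
    a self-attention similarity matrix between EEG channel (node) features
    is fused with an intrinsic spatial matrix, then sparsified with a
    learnable soft threshold. The fusion rule and threshold form are
    assumptions, not the authors' published implementation."""

    def __init__(self, feat_dim):
        super().__init__()
        self.query = nn.Linear(feat_dim, feat_dim)
        self.key = nn.Linear(feat_dim, feat_dim)
        # Learnable fusion weight between attention and spatial graphs (assumed).
        self.alpha = nn.Parameter(torch.tensor(0.5))
        # Learnable soft-threshold level (assumed to be a single scalar).
        self.tau = nn.Parameter(torch.tensor(0.1))

    def forward(self, x, spatial):
        # x: (num_nodes, feat_dim) node features, e.g. GRU states or PSD values.
        # spatial: (num_nodes, num_nodes) intrinsic spatial matrix of the montage.
        q, k = self.query(x), self.key(x)
        # Scaled dot-product self-attention gives the node-similarity matrix.
        sim = torch.softmax(q @ k.t() / q.size(-1) ** 0.5, dim=-1)
        # Fuse the learned similarity with the fixed spatial structure.
        fused = self.alpha * sim + (1.0 - self.alpha) * spatial
        # Soft thresholding: shrink small edge weights to zero to suppress noise.
        return torch.sign(fused) * F.relu(fused.abs() - self.tau)

# Example usage with assumed shapes: 62 EEG channels, 32-dim node features.
x = torch.randn(62, 32)
spatial = torch.eye(62)  # placeholder for the montage-based spatial matrix
adj = SelfAttentionAdjacency(feat_dim=32)(x, spatial)
```

The resulting sparse adjacency matrix would then be paired with the node features of each subgraph and passed to the dual-branch GNN for emotion classification.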
