Abstract
Emotion recognition based on electroencephalography (EEG) plays a pivotal role in affective computing, and graph convolutional networks (GCNs) have proved effective for this task, making considerable progress. Since the adjacency matrix describing electrode relationships is critical in a GCN, exploring effective electrode relationships becomes necessary. However, in emotion recognition the adjacency matrix and its values are typically set empirically and subjectively, and whether they match the target task remains open to question. To address this problem, we propose a graph convolutional network with learnable electrode relations (LR-GCN), which learns the adjacency matrix automatically in a goal-driven manner: self-attention updates the Laplacian matrix in the forward pass, and gradient propagation updates the adjacency matrix in the backward pass. Compared with previous works that use simple electrode relationships or only feature information, LR-GCN achieves higher recognition performance by extracting more reasonable electrode relationships during training. In subject-dependent experiments on the SEED database, LR-GCN achieved recognition accuracies of 94.72% with differential entropy (DE) features and 85.24% with power spectral density (PSD) features. Visualizing the optimized Laplacian matrix shows that brain connections related to vision, hearing, and emotion are enhanced.
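The forward update described above — deriving a data-dependent adjacency matrix from self-attention over electrode features, normalizing it into a Laplacian-style propagation matrix, and applying one graph convolution — can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the electrode/feature dimensions, weight names (`W_q`, `W_k`, `W`), and the specific normalization are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_electrodes, feat_dim = 62, 5   # e.g. 62 EEG channels, 5 band features (assumed)
X = rng.standard_normal((n_electrodes, feat_dim))

# Learnable parameters (in training these would be updated by backpropagation,
# which is how the adjacency is refined in the backward pass)
W_q = rng.standard_normal((feat_dim, feat_dim))  # query projection (hypothetical)
W_k = rng.standard_normal((feat_dim, feat_dim))  # key projection (hypothetical)
W = rng.standard_normal((feat_dim, feat_dim))    # graph-conv weight (hypothetical)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Forward update: self-attention yields a data-dependent adjacency matrix
scores = (X @ W_q) @ (X @ W_k).T / np.sqrt(feat_dim)
A = softmax(scores, axis=1)          # each row sums to 1

# Symmetrically normalized propagation matrix L = D^{-1/2} A D^{-1/2}
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = D_inv_sqrt @ A @ D_inv_sqrt

# One graph-convolution step with ReLU activation
H = np.maximum(L @ X @ W, 0.0)
print(H.shape)  # (62, 5)
```

Because the attention weights depend on the input features, the effective electrode graph changes with the data rather than being fixed by hand, which is the key departure from empirically set adjacency matrices.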
Published in: Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)