In recent years, deep learning has attracted growing research interest for Electroencephalogram (EEG)-based emotion recognition. Current research emphasizes exploiting the useful information within each single EEG channel or each individual set of multi-channel EEG, but overlooks the correlation information among different multi-channel EEG sets. To exploit such discriminative correlation information, we propose a novel and effective method, the "Trainable Adjacency Relation Driven Graph Convolutional Network (TARDGCN)", which contains two complementary modules, TAR and GCN. TAR optimizes the local pair-wise positions of the multi-channel EEG sets, forming an improved graph representation from which GCN learns the global correlation among these sets for classification. The proposed method can also cope with the small sample size but large data variation characteristic of this task. Experimental results on the DREAMER and DEAP databases, in both subject-dependent and subject-independent modes, show that TARDGCN outperforms state-of-the-art approaches in classifying valence, arousal, and dominance.
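To make the idea of a trainable adjacency feeding a graph convolution concrete, here is a minimal NumPy sketch. It is not the authors' implementation: the number of EEG sets, feature dimensions, and the row-softmax normalization of the learned adjacency are all illustrative assumptions, standing in for the TAR module's learned pair-wise relations.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along an axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Hypothetical sizes: 8 multi-channel EEG sets (graph nodes),
# each described by a 16-dimensional feature vector.
n_sets, feat_dim, out_dim = 8, 16, 4

# Trainable adjacency (sketch of "TAR"): a free parameter matrix,
# row-normalized so each EEG set's neighbor weights sum to 1.
adj_logits = rng.normal(size=(n_sets, n_sets))  # learned by backprop in practice
A = softmax(adj_logits, axis=1)

# One graph-convolution step over that adjacency: aggregate neighbor
# features via A, project with a weight matrix W, apply ReLU.
X = rng.normal(size=(n_sets, feat_dim))  # per-set EEG features
W = rng.normal(size=(feat_dim, out_dim))
H = np.maximum(A @ X @ W, 0.0)           # H = ReLU(A X W)

print(H.shape)  # node embeddings carrying cross-set correlation
```

The key design point this sketch illustrates is that `adj_logits` is a parameter rather than a fixed graph, so the pair-wise relations among EEG sets are optimized jointly with the classifier instead of being hand-specified.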