Abstract

Electroencephalogram (EEG)-based emotion computing has become a prominent topic in brain-computer fusion. EEG signals have inherent temporal and spatial characteristics; however, existing studies have not fully exploited these two properties. In addition, the position encoding mechanism in the vanilla transformer cannot effectively encode the continuous temporal character of emotion. A temporal relative (TR) encoding mechanism is therefore proposed to encode the temporal EEG signals and construct temporality self-attention in the transformer. To explore the contribution of each EEG channel, corresponding to an electrode on the cerebral cortex, to emotion analysis, a channel-attention (CA) mechanism is presented. The temporality self-attention mechanism cooperates with the channel-attention mechanism to exploit the temporal and spatial information of the preprocessed EEG signals simultaneously. Extensive experiments are conducted on the DEAP dataset, including binary classification on valence, arousal, dominance, and liking. Furthermore, a discrete emotion classification task is conducted by mapping the dimensional annotations of DEAP into five discrete emotion categories. Experimental results demonstrate that our model outperforms advanced existing methods on all classification tasks.
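The sketch below is not the authors' implementation; it is a minimal illustration of the two ideas named in the abstract, assuming a squeeze-and-excitation style module for channel attention and a learned relative-position bias standing in for the temporal relative (TR) encoding. All module names, shapes (32 electrodes, 128 time samples, as in DEAP-style preprocessing), and hyperparameters are assumptions for illustration only.

```python
# Hedged sketch, not the paper's code: channel attention over EEG electrodes
# plus single-head self-attention with a learned relative temporal bias.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Reweights EEG channels (electrodes) by learned importance."""
    def __init__(self, n_channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(n_channels, n_channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(n_channels // reduction, n_channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                      # x: (batch, channels, time)
        squeeze = x.mean(dim=-1)               # global average over time
        weights = self.fc(squeeze)             # per-channel attention weights
        return x * weights.unsqueeze(-1)       # rescale each channel


class TemporalRelativeSelfAttention(nn.Module):
    """Self-attention whose scores carry a bias indexed by relative time offset."""
    def __init__(self, d_model: int, max_len: int):
        super().__init__()
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.scale = d_model ** -0.5
        # one learnable bias per relative offset in [-(max_len-1), max_len-1]
        self.rel_bias = nn.Parameter(torch.zeros(2 * max_len - 1))

    def forward(self, x):                      # x: (batch, time, d_model)
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        scores = (q @ k.transpose(-2, -1)) * self.scale      # (b, t, t)
        offsets = torch.arange(t, device=x.device)
        rel = offsets[None, :] - offsets[:, None] + (t - 1)  # map offsets to [0, 2t-2]
        scores = scores + self.rel_bias[rel]                 # add relative temporal bias
        attn = F.softmax(scores, dim=-1)
        return attn @ v


if __name__ == "__main__":
    eeg = torch.randn(8, 32, 128)              # (batch, 32 electrodes, 128 samples)
    eeg = ChannelAttention(n_channels=32)(eeg)
    tokens = eeg.transpose(1, 2)               # treat time steps as tokens
    out = TemporalRelativeSelfAttention(d_model=32, max_len=128)(tokens)
    print(out.shape)                           # torch.Size([8, 128, 32])
```

In this sketch the channel-attention stage rescales electrodes before the transformer block, while the relative bias replaces the vanilla absolute position encoding so that attention depends on the temporal distance between EEG samples rather than their absolute indices.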
