Abstract

To address the low emotion recognition rates obtained from single-modality physiological signals, we propose a convolutional recurrent neural network based method for multi-modal physiological signal emotion recognition. The method uses a convolutional neural network to learn spatial representations of multi-channel EEG signals and a Long Short-Term Memory network to learn temporal representations of peripheral physiological signals (EOG, EMG, GSR, RSP, BVP, and TMP); the two representations are then combined for emotion classification. On the Arousal and Valence emotion dimensions, experiments conducted on the open DEAP dataset show that the method achieves average accuracies of 89.68% and 89.19% for EEG-only emotion classification, 63.06% and 62.41% for peripheral physiological signals alone, and 93.06% and 91.95% for the combined features. These results indicate that the proposed convolutional recurrent neural network efficiently extracts multi-modal physiological signal features and improves emotion recognition performance.
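The fusion idea described above can be sketched in plain NumPy. This is a minimal illustration of the pipeline shape only: all sizes, the single random 3x3 filter standing in for the CNN, the hand-rolled LSTM cell, and the untrained weights are assumptions for demonstration, not the paper's actual architecture or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv2d_valid(x, k):
    """Naive 'valid' 2D cross-correlation of x (H, W) with kernel k (kh, kw)."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gate pre-activations stacked as [i, f, o, g]."""
    d = h.shape[0]
    z = W @ x + U @ h + b
    i, f, o = sigmoid(z[:d]), sigmoid(z[d:2 * d]), sigmoid(z[2 * d:3 * d])
    g = np.tanh(z[3 * d:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# Toy inputs (shapes are illustrative, not the paper's preprocessing).
eeg = rng.standard_normal((32, 128))    # 32 EEG channels x 128 time samples
periph = rng.standard_normal((64, 6))   # 64 time steps x 6 peripheral channels

# CNN branch: one 3x3 filter, ReLU, average pooling over time per row.
kernel = rng.standard_normal((3, 3))
fmap = np.maximum(conv2d_valid(eeg, kernel), 0.0)
spatial_feat = fmap.mean(axis=1)        # spatial representation of EEG

# LSTM branch: hidden size 8, small random weights, last hidden state kept.
d, f_in = 8, periph.shape[1]
W = rng.standard_normal((4 * d, f_in)) * 0.1
U = rng.standard_normal((4 * d, d)) * 0.1
b = np.zeros(4 * d)
h, c = np.zeros(d), np.zeros(d)
for t in range(periph.shape[0]):
    h, c = lstm_step(periph[t], h, c, W, U, b)
temporal_feat = h                       # temporal representation of peripherals

# Fusion: concatenate the two representations and classify high/low arousal.
fused = np.concatenate([spatial_feat, temporal_feat])
W_out = rng.standard_normal((2, fused.shape[0])) * 0.1
logits = W_out @ fused
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs.shape)
```

In the paper the two branches are trained jointly and the fused vector feeds a learned classifier; here the weights are random, so only the data flow (spatial branch, temporal branch, concatenation, softmax over two classes) is meaningful.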
