Abstract

Using electroencephalogram (EEG) signals to recognize emotional states has become a research hotspot in affective computing. Previous emotion recognition methods have largely ignored the correlation and interaction among multichannel EEG signals, which may provide salient information related to emotional states. This article proposes a novel approach based on rearranged EEG features and a deep learning algorithm. In particular, the EEG signal of each channel is first processed in the time domain to obtain time-domain features. The features of all channels are then arranged into a three-dimensional (3D) feature matrix according to the positions of the electrode sensors, which brings the features closer to the actual response of the cerebral cortex. Subsequently, an advanced convolutional neural network (CNN), designed with a univariate convolution layer and a multivariate convolution layer, processes the 3D feature matrix for emotion recognition. A benchmark dataset for emotion analysis using physiological signals is employed to evaluate the method. The experimental results show that the 3D feature matrix can effectively represent the emotion-related features in multichannel EEG signals and that the proposed CNN can efficaciously mine both the unique features of each channel and the correlations among channels for emotion recognition.
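The feature-rearrangement step described above can be illustrated with a minimal NumPy sketch. The grid dimensions, the electrode-to-cell mapping, and the choice of three time-domain features are all assumptions for illustration; the abstract does not specify the actual layout or feature set used in the paper.

```python
import numpy as np

# Hypothetical 9x9 scalp grid. Electrode names follow the 10-20 system,
# but this particular mapping is illustrative, not the paper's layout.
GRID_H, GRID_W = 9, 9
ELECTRODE_POS = {
    "Fp1": (0, 3), "Fp2": (0, 5),
    "F3":  (2, 2), "F4":  (2, 6),
    "C3":  (4, 2), "Cz":  (4, 4), "C4": (4, 6),
    "O1":  (8, 3), "O2":  (8, 5),
}

def to_3d_feature_matrix(channel_features, n_feats):
    """Scatter per-channel time-domain feature vectors into a
    (GRID_H, GRID_W, n_feats) matrix according to electrode position,
    so spatially adjacent sensors end up in adjacent cells.
    Grid cells with no electrode remain zero."""
    mat = np.zeros((GRID_H, GRID_W, n_feats))
    for name, feats in channel_features.items():
        r, c = ELECTRODE_POS[name]
        mat[r, c, :] = feats
    return mat

# Example: three illustrative time-domain features per channel
# (e.g., mean, standard deviation, mean of first differences).
rng = np.random.default_rng(0)
feats = {ch: rng.normal(size=3) for ch in ELECTRODE_POS}
m = to_3d_feature_matrix(feats, 3)
print(m.shape)  # (9, 9, 3)
```

A matrix of this shape can then be fed to a CNN like an image, which is what lets convolution layers exploit spatial correlation among channels.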
