Abstract

Electroencephalogram (EEG) signals have been widely used in emotion recognition because of their authenticity and resistance to forgery, and EEG emotion recognition has therefore become one of the main technologies of affective computing. EEG signals carry complex, interrelated time-domain, frequency-domain, and spatial-domain (TFS) information. To address the insufficient exploitation of TFS feature information and the resulting low recognition rates in EEG emotion recognition, this paper presents a Multi-Task Joint Neural Network (MT-2DCNN-LSTM) built from a two-dimensional convolutional neural network (2DCNN) and a long short-term memory network (LSTM). Frequency-domain and spatial-domain features are arranged into a 3D feature matrix, while time-domain features are organized as 2D sequences; these two representations are then used jointly as the model's inputs so that the TFS feature information of the EEG signals can be fully extracted. To verify the model's recognition ability, classification experiments were carried out on the DEAP dataset, a widely used benchmark. The average recognition accuracy reaches 97.29% for arousal and 97.72% for valence. These results show that MT-2DCNN-LSTM delivers excellent performance.
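
The abstract describes a dual-branch design: a 2DCNN branch over a frequency/spatial feature map and an LSTM branch over time-domain sequences, whose outputs are combined for classification. The following is a minimal, hypothetical sketch of that idea in PyTorch; the input shapes (4 frequency bands on a 9x9 electrode grid, 128-step sequences over 32 channels, typical of DEAP-style preprocessing) and all layer sizes are assumptions for illustration, not the authors' exact configuration.

```python
# Hypothetical sketch of a dual-branch 2DCNN + LSTM model (not the paper's exact network).
import torch
import torch.nn as nn

class MT2DCNNLSTM(nn.Module):
    def __init__(self, n_bands=4, grid=9, n_channels=32, n_classes=2):
        super().__init__()
        # Frequency/spatial branch: 2D convolutions over the electrode grid,
        # with frequency bands treated as input channels.
        self.cnn = nn.Sequential(
            nn.Conv2d(n_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),          # -> (batch, 64)
        )
        # Time-domain branch: LSTM over the 2D sequence (time steps x channels).
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=64, batch_first=True)
        # Joint classifier over the concatenated branch features.
        self.fc = nn.Sequential(nn.Linear(64 + 64, 128), nn.ReLU(),
                                nn.Linear(128, n_classes))

    def forward(self, x_spatial, x_temporal):
        # x_spatial: (batch, n_bands, grid, grid); x_temporal: (batch, seq_len, n_channels)
        f_spatial = self.cnn(x_spatial)
        _, (h_n, _) = self.lstm(x_temporal)
        f_temporal = h_n[-1]                                # last hidden state -> (batch, 64)
        return self.fc(torch.cat([f_spatial, f_temporal], dim=1))

# Example forward pass with random tensors shaped like the assumed inputs.
model = MT2DCNNLSTM()
logits = model(torch.randn(8, 4, 9, 9), torch.randn(8, 128, 32))
print(logits.shape)  # torch.Size([8, 2]) -- e.g. high/low arousal or valence
```

Under these assumptions, each branch specializes in one part of the TFS information (the CNN in spatial/frequency structure, the LSTM in temporal dynamics), and the fused features feed a shared classifier, which is one common way to realize the joint scheme outlined in the abstract.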
