Abstract

Human emotions are an important part of daily life. In this paper, a novel multilayer network-based convolutional neural network (CNN) model is proposed for emotion recognition from multi-channel nonlinear EEG signals. First, in response to the multi-rhythm properties of the brain, a multilayer brain network with five rhythm-based layers is derived, where each layer specifically describes one frequency band. Subsequently, a novel CNN model is carefully designed that takes the multilayer brain network as input and enables deep learning of discriminative nonlinear features from the channel and frequency perspectives. Moreover, a DenseNet model is developed as a second branch to learn time-domain nonlinear features from the EEG signals. All learned features are finally concatenated for emotion recognition. The publicly available SEED dataset is used to evaluate the proposed method, which achieves good results on all 15 subjects, with an average accuracy of 91.31%. Our method builds a bridge between multilayer networks and deep learning, suggesting an effective approach for analyzing multivariate nonlinear time series, especially multi-channel EEG signals.
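
As a rough illustration of the dual-branch design summarized above, the following is a minimal PyTorch sketch, not the authors' implementation: it assumes a five-layer brain-network input of shape (batch, 5, C, C) for C EEG channels, replaces the full DenseNet with a small DenseNet-style block for brevity, and uses illustrative layer sizes, input shapes, and class counts throughout.

```python
# Hypothetical sketch only; layer sizes, input shapes, and the simplified
# DenseNet-style branch are assumptions, not the authors' implementation.
import torch
import torch.nn as nn


class DenseBlock(nn.Module):
    """Minimal DenseNet-style block: each conv sees all earlier feature maps."""
    def __init__(self, in_ch, growth=16, n_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(n_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(ch), nn.ReLU(),
                nn.Conv2d(ch, growth, kernel_size=3, padding=1)))
            ch += growth
        self.out_channels = ch

    def forward(self, x):
        for layer in self.layers:
            x = torch.cat([x, layer(x)], dim=1)  # dense connectivity
        return x


class DualBranchEmotionNet(nn.Module):
    """Branch 1: CNN over the 5-rhythm multilayer brain network.
    Branch 2: DenseNet-style CNN over a time-domain representation.
    The two feature vectors are concatenated for emotion classification."""
    def __init__(self, n_classes=3):
        super().__init__()
        # The multilayer brain network is treated as a 5-channel "image"
        # of channel-by-channel connectivity matrices.
        self.network_branch = nn.Sequential(
            nn.Conv2d(5, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())        # -> (batch, 64)
        # Time-domain branch: assumed here to receive a single-channel 2-D map.
        dense = DenseBlock(in_ch=8)
        self.time_branch = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            dense,
            nn.AdaptiveAvgPool2d(1), nn.Flatten())        # -> (batch, dense.out_channels)
        self.classifier = nn.Linear(64 + dense.out_channels, n_classes)

    def forward(self, brain_net, time_input):
        f1 = self.network_branch(brain_net)   # brain_net: (batch, 5, C, C)
        f2 = self.time_branch(time_input)     # time_input: (batch, 1, H, W)
        return self.classifier(torch.cat([f1, f2], dim=1))


# Example forward pass, assuming 62 EEG channels and 3 emotion classes
# (as in SEED); the time-domain map shape is arbitrary for illustration.
model = DualBranchEmotionNet(n_classes=3)
logits = model(torch.randn(4, 5, 62, 62), torch.randn(4, 1, 32, 128))
print(logits.shape)  # torch.Size([4, 3])
```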
