Abstract

Emotions play a crucial role in human interaction and healthcare. This study introduces an automatic emotion recognition system based on deep learning using electroencephalogram signals. A lightweight pyramidal one-dimensional convolutional neural network model is proposed that involves a small number of learnable parameters. Using this model, a two-level ensemble classifier is designed. At the first level, each channel is scanned incrementally to generate predictions, which are fused by majority vote. At the second level, the per-channel predictions for a signal are fused, again by majority vote, to predict the emotional state. The method was validated on a challenging public-domain benchmark dataset, with electroencephalogram signals analyzed over five brain regions. The results indicate that the frontal brain region plays a dominant role in both emotion recognition problems (distinguishing high-valence vs. low-valence and high-arousal vs. low-arousal states).
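The two-level majority-vote fusion described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `predict` function stands in for the pyramidal 1D CNN, and the window/channel structure is assumed for demonstration.

```python
from collections import Counter

def majority_vote(labels):
    # Return the most frequent label among a set of predictions.
    return Counter(labels).most_common(1)[0][0]

def classify_signal(windows_per_channel, predict):
    """Two-level ensemble over a multi-channel EEG signal.

    windows_per_channel: for each channel, a list of incrementally
        scanned windows (any objects the classifier accepts).
    predict: stand-in for the per-window CNN classifier,
        mapping one window to a class label.
    """
    # Level 1: within each channel, classify every window and
    # fuse the window-level predictions by majority vote.
    channel_labels = [
        majority_vote([predict(w) for w in windows])
        for windows in windows_per_channel
    ]
    # Level 2: fuse the per-channel labels by majority vote
    # to obtain the final emotional-state prediction.
    return majority_vote(channel_labels)
```

For example, with a toy classifier that simply returns each window's value as its label, three channels whose window votes resolve to 0, 1, and 0 yield a final prediction of 0.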
