Abstract

Emotion recognition is an increasingly relevant field due to its direct implications for various sectors of society. The field aims to improve the understanding of how emotions influence human behavior. Because emotions can also manifest non-verbally, brain activity can be analyzed through electroencephalogram (EEG) signals. In this context, machine learning is promising given the complexity of recognizing emotions from the brain's electrical signals. The case study focuses on DEAP, a well-known dataset built from electroencephalography experiments in which subjects were exposed to musical and visual stimuli. The main objective of this work is to present an emotion-classification pipeline based on topographic-map images generated with the EEGLAB tool from EEG signals. Additional contributions include a structured dataset created by mapping the temporal, spatial, and frequency information derived from the topographic images, and models for predicting the dimensional emotions of arousal and valence trained on this new dataset. Results show accuracies of 85.46% and 85.05% for classifying low/high arousal and low/high valence, respectively.
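
As a rough illustration of this kind of pipeline, the sketch below is not the authors' implementation: it substitutes per-channel band-power features for the topographic-map images described above, uses synthetic data with DEAP-like dimensions (the real recordings require registration to obtain), and assumes a generic random-forest classifier. Data shapes, band definitions, and the classifier choice are illustrative assumptions only.

```python
# Minimal sketch of an EEG arousal/valence classification pipeline (assumptions
# only): synthetic data standing in for DEAP, band-power features standing in
# for the topographic-map images used in the paper.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, sfreq, duration = 120, 32, 128, 60   # DEAP-like shapes (assumed)
eeg = rng.standard_normal((n_trials, n_channels, sfreq * duration))
labels = rng.integers(0, 2, size=n_trials)                  # placeholder low/high arousal

# Frequency bands commonly used in EEG emotion studies.
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(trial):
    """Average spectral power per channel and band -> flat feature vector."""
    freqs, psd = welch(trial, fs=sfreq, nperseg=sfreq * 2, axis=-1)
    feats = []
    for lo, hi in bands.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=-1))             # one value per channel
    return np.concatenate(feats)

X = np.stack([band_power_features(trial) for trial in eeg])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

With real DEAP recordings and image-based features, the same train/evaluate structure would apply, with the feature-extraction step replaced by the topographic-map representation described in the abstract.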
