Abstract

The use of music to address emotional, cognitive, and physical needs of an individual, as proposed by music therapy, requires at least identifying the emotional expression that a piece of music evokes in people. The aim of this study is to build a computational system that identifies emotions as temporal representations from Electroencephalography (EEG) signals. First, a set of eight pieces of music is composed to theoretically evoke four emotions: happiness, sadness, calmness, and anger. Then, EEG signals from 30 participants are recorded, first while they remain silent for one minute (baseline) and then while they listen to the stimuli (the eight pieces of music). After listening to each stimulus, participants provide arousal and valence scores through the Self-Assessment Manikin (SAM) test. The EEGLife dataset is built by reducing noise and removing artifacts from the raw EEG signals and labeling them with the participants' scores. A set of Self-Organizing Maps (the SOM model) is proposed to classify emotions. A subject-specific training scheme, reserving 20% of each participant's data for validation, is applied not only to the EEGLife dataset but also to the well-known DEAP benchmark dataset. Cross-correlation and bandpower features are extracted from both datasets to feed the model. A Density Peaks clustering algorithm and a fuzzy system are used to obtain the final emotional scores from the model. Precision, recall, F1-score, and accuracy metrics are used to assess the performance of the proposed model. Results show that emotional expression information is extracted more easily from the EEGLife dataset than from DEAP. Finally, the SOM model is interpretable, allowing EEG signals to be analyzed visually on the Self-Organizing Maps.
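
The feature-extraction and classification steps summarized above can be illustrated with a minimal sketch. The block below computes bandpower features per EEG channel with Welch's method and maps them onto a Self-Organizing Map. It is an assumption-laden illustration, not the authors' implementation: the 128 Hz sampling rate, map size, epoch shape, and the use of the MiniSom library are stand-ins, and the paper's cross-correlation features, Density Peaks clustering, and fuzzy scoring stage are omitted.

```python
# Minimal sketch (assumptions: 128 Hz sampling rate, MiniSom as a stand-in
# for the paper's SOM model; not the authors' implementation).
import numpy as np
from scipy.signal import welch
from minisom import MiniSom

FS = 128  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def bandpower_features(epoch):
    """Compute bandpower per channel for one EEG epoch (channels x samples)."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        idx = (freqs >= lo) & (freqs < hi)
        # Integrate the PSD over the band for every channel.
        feats.append(np.trapz(psd[:, idx], freqs[idx], axis=-1))
    return np.concatenate(feats)  # length: n_channels * n_bands

# Hypothetical data: 200 epochs of 32-channel, 2-second EEG segments.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 32, FS * 2))
X = np.array([bandpower_features(e) for e in epochs])
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)  # z-score the features

# Train a small SOM; in the paper each node would later be associated with an
# emotion quadrant derived from the SAM valence/arousal labels.
som = MiniSom(10, 10, input_len=X.shape[1], sigma=1.5, learning_rate=0.5,
              random_seed=0)
som.train_random(X, num_iteration=5000)
winning_nodes = [som.winner(x) for x in X]  # node coordinates per epoch
```

In the full pipeline described by the abstract, the winning nodes would feed a Density Peaks clustering step and a fuzzy system that produce the final emotional scores, which are then evaluated with precision, recall, F1-score, and accuracy.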
