Abstract

We analyzed the contribution of electroencephalogram (EEG) data, age, sex, and personality traits to emotion recognition, through the classification of arousal, valence, and discrete emotion labels, using feature selection techniques and machine learning classifiers. EEG features together with age, sex, and personality traits were retrieved from a well-known dataset (AMIGOS), and two feature sets were built to compare classification performance. We found that age, sex, and personality traits were not significantly associated with the classification of arousal, valence, and discrete emotions using machine learning. The added EEG features increased classification accuracy for arousal and valence labels compared with the original report. Classification of arousal and valence labels exceeded chance level but did not surpass 70% accuracy in any of the tested scenarios. For discrete emotions, the mean accuracies and mean area under the curve scores were above chance; however, F1 scores were low, implying that many false positives and false negatives were present. This study characterizes the performance of EEG features, age, sex, and personality traits in emotion classifiers. These findings could help clarify how these traits relate at a technological and data level for personalized human-computer interaction systems.
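To make the comparison described above concrete, the following is a minimal sketch of how two feature sets (EEG-only versus EEG plus age, sex, and personality traits) could be evaluated with feature selection and a classifier, reporting accuracy, area under the curve, and F1. The random placeholder data, the SelectKBest/RandomForest choices, and the feature dimensions are illustrative assumptions, not the authors' exact pipeline.

```python
# Hypothetical sketch: compare an EEG-only feature set against EEG plus
# age/sex/personality traits for binary arousal classification.
# All data here is random placeholder data; the real features would come
# from the AMIGOS recordings and self-assessments.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_validate
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_trials = 200                                # placeholder number of labelled trials
eeg = rng.normal(size=(n_trials, 105))        # e.g., band-power features per channel
traits = rng.normal(size=(n_trials, 7))       # age, sex, Big Five scores (illustrative)
y = rng.integers(0, 2, size=n_trials)         # binary arousal label (low/high)

feature_sets = {
    "EEG only": eeg,
    "EEG + age/sex/personality": np.hstack([eeg, traits]),
}

for name, X in feature_sets.items():
    # Univariate feature selection followed by a random forest classifier.
    clf = make_pipeline(SelectKBest(f_classif, k=40),
                        RandomForestClassifier(random_state=0))
    scores = cross_validate(clf, X, y, cv=5,
                            scoring=("accuracy", "roc_auc", "f1"))
    print(f"{name}: acc={scores['test_accuracy'].mean():.2f} "
          f"auc={scores['test_roc_auc'].mean():.2f} "
          f"f1={scores['test_f1'].mean():.2f}")
```

Comparing the cross-validated metrics of the two feature sets is one way to check whether adding demographic and personality traits changes classifier performance, which mirrors the comparison reported in the abstract.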

Highlights

  • Emotions influence how people process information and make decisions, and they shape their behavior when they interact with their surroundings

  • The results obtained in this work revealed that neither age, sex, nor personality correlated with the arousal and valence labels of the emotional stimuli

  • Compared against the self-assessed emotional labels, some demographic characteristics and personality traits were chosen by the feature selection for arousal and for some of the discrete emotions; this might be because the self-assessed responses relied on participants’ subjective emotion assessment


Introduction

Emotions influence how people process information and make decisions, and they shape their behavior when they interact with their surroundings. For new human-computer interaction (HCI) paradigms, in which systems are in constant contact with their users, it is important to identify and recognize users’ emotional states with high accuracy, in order to improve interactions between digital systems and users and provide a more personalized experience [2]. From an HCI perspective, it is important to find new ways in which systems can be personalized to the user and to achieve better cooperation in fields such as assistive and companion computing, using physiological signals such as electroencephalograms (EEG), a useful tool that describes how cognition and emotional behavior are related at a physiological level [3,4,5]. It is expected that demographic characteristics and personality traits will foster emotion recognition processes and help them achieve higher performance.
