Abstract

Most electroencephalography (EEG)-based emotion recognition systems rely on a single stimulus type, such as videos, sounds, or images, to evoke emotions, and few studies have examined self-induced emotions. The question of whether different stimulus-presentation paradigms for the same emotion produce any subject- and stimulus-independent neural correlates remains unanswered. Moreover, while publicly available datasets are used in a large number of studies targeting EEG-based emotional state recognition, classifying emotions while subjects experience different stimulus-presentation paradigms, a major concern and contribution of this work, required new experiments. This paper presents a novel experimental study that recorded EEG data for three human emotional states, fear, neutral, and joy, evoked with four different stimulus-presentation paradigms: emotional imagery, pictures, sounds, and audio–video movie clips. Features were extracted from the recorded EEG data with the common spatial pattern (CSP) method and classified through linear discriminant analysis (LDA). Experiments were conducted with twenty-five participants. Classification performance in the different paradigms was evaluated across spectral bands. With a few exceptions, all paradigms showed the best emotion recognition in the higher-frequency spectral ranges. Interestingly, joy was classified more accurately than fear. The average neural patterns for fear vs. joy emotional states are presented as topographical maps based on the CSP spatial filters for averaged band-power changes in all four paradigms. With respect to spectral bands, beta and alpha oscillation responses produced the highest number of significant results for the paradigms under consideration.
With respect to brain regions, the frontal lobe produced the most significant results irrespective of paradigm and spectral band; temporal sites also played an effective role in generating statistically significant findings. To the best of our knowledge, no previous study has addressed EEG-based emotion recognition across four different stimulus paradigms. This work therefore contributes towards designing EEG-based systems for human emotion recognition that could work effectively in different real-time scenarios.

Highlights

  • Emotional state recognition plays an important role in the research area of human–computer interaction.

  • Various studies have investigated how EEG signals correlate with human emotions [13,14,15]. While reviewing the literature, we found that most EEG-based emotion recognition studies have used a single method to elicit emotions [13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47].

  • We present a novel dataset recording EEG data for fear, neutral, and joy human emotional states evoked with four different stimulus-presentation paradigms.

  • We identify the most relevant spectral bands and brain regions with respect to each paradigm.

  • The common spatial pattern (CSP) method has been widely used in EEG-based BCI applications such as motor imagery; this work investigates whether it is also a good choice for emotion recognition.
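The CSP-plus-LDA pipeline named above can be sketched end to end on synthetic data. Everything below is an illustrative assumption, not the paper's recording setup: channel count, trial count, and the variance structure separating the two classes are invented for demonstration, and a two-class Fisher LDA is implemented directly in NumPy.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n_trials, n_ch, n_samp = 40, 8, 256  # assumed sizes, not the paper's montage

# Synthetic trials: one class carries extra variance on channel 0, the other on channel 1
X_joy = rng.standard_normal((n_trials, n_ch, n_samp))
X_joy[:, 0] *= 3.0
X_fear = rng.standard_normal((n_trials, n_ch, n_samp))
X_fear[:, 1] *= 3.0

def class_cov(X):
    # Average trace-normalized spatial covariance over trials
    covs = [x @ x.T / np.trace(x @ x.T) for x in X]
    return np.mean(covs, axis=0)

C1, C2 = class_cov(X_joy), class_cov(X_fear)

# CSP: generalized eigendecomposition C1 w = lambda (C1 + C2) w
eigvals, W = eigh(C1, C1 + C2)
m = 2  # keep m filters from each end (extreme variance ratios are most discriminative)
filters = np.concatenate([W[:, :m], W[:, -m:]], axis=1).T  # shape (2m, n_ch)

def features(X):
    # Normalized log-variance of the CSP-filtered signals
    Z = np.einsum('fc,tcs->tfs', filters, X)
    var = Z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

F = np.vstack([features(X_joy), features(X_fear)])
y = np.array([0] * n_trials + [1] * n_trials)

# Two-class Fisher LDA: project onto Sw^{-1} (m1 - m0), threshold at the midpoint
m0, m1 = F[y == 0].mean(axis=0), F[y == 1].mean(axis=0)
Sw = np.cov(F[y == 0].T) + np.cov(F[y == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
b = -w @ (m0 + m1) / 2
pred = (F @ w + b > 0).astype(int)
acc = (pred == y).mean()
```

Because the synthetic classes differ cleanly in per-channel variance, the CSP filters concentrate that difference into a few components and the LDA separates them easily; on real EEG the same pipeline would be run per spectral band with cross-validated evaluation.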


Summary

Introduction

Emotional state recognition plays an important role in the research area of human–computer interaction. The ability to identify a person’s emotional state from scalp-acquired electroencephalographic (EEG) data could be of clinical importance for anger management, depression, anxiety, or stress reduction, especially for persons with communication disabilities. Different categories of emotional states include fear, disgust, pride, happiness, anger, etc. The question as to “whether different stimuli of elicitation for the same emotion generate any subject- and stimulus-independent correlates” remains unanswered.

