Abstract
Most current approaches to emotion recognition are based on neural signals elicited by affective materials such as images, sounds, and videos. However, the use of neural patterns to recognize self-induced emotions remains uninvestigated. In this study, we inferred the patterns and neural signatures of self-induced emotions from electroencephalogram (EEG) signals. The EEG signals of 30 participants were recorded while they watched 18 Chinese movie clips intended to elicit six discrete emotions: joy, neutrality, sadness, disgust, anger, and fear. After watching each clip, the participants were asked to self-induce the corresponding emotion by recalling a specific scene from the movie. We analyzed the important features, electrode distributions, and average neural patterns of the different self-induced emotions. Results demonstrated that features related to the high-frequency rhythms of EEG signals, recorded from electrodes over the bilateral temporal, prefrontal, and occipital lobes, performed best in discriminating emotions. Moreover, the six discrete categories of self-induced emotion exhibited specific neural patterns and brain topography distributions. We achieved an average accuracy of 87.36% in discriminating positive from negative self-induced emotions and 54.52% in classifying emotions into six discrete categories. Our research will help promote the development of comprehensive endogenous emotion recognition methods.
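One of the high-frequency features referred to above (and named in the highlights below) is the differential entropy (DE) of the gamma band. As a minimal sketch of how such a feature might be computed per channel and time window, assuming an approximately Gaussian band-filtered signal; the sampling rate, band edges, and window length here are illustrative assumptions rather than values reported in the study:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)

def differential_entropy(x):
    """DE of an approximately Gaussian signal: 0.5 * ln(2*pi*e*var)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def gamma_de_features(eeg, fs=200, band=(30.0, 50.0), win_sec=1.0):
    """Gamma-band DE per channel over non-overlapping windows.

    eeg: array of shape (n_channels, n_samples).
    Returns an array of shape (n_channels, n_windows).
    """
    win = int(win_sec * fs)
    n_windows = eeg.shape[1] // win
    feats = np.empty((eeg.shape[0], n_windows))
    for ch in range(eeg.shape[0]):
        filtered = bandpass(eeg[ch], band[0], band[1], fs)
        for w in range(n_windows):
            feats[ch, w] = differential_entropy(filtered[w * win:(w + 1) * win])
    return feats
```

The resulting feature matrix can then be fed to any standard classifier for the binary (positive vs. negative) or six-class discrimination described above.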
Highlights
Given that emotion plays an important role in our daily lives and work, the real-time assessment and regulation of emotions can improve our quality of life
We explored the classification of self-induced emotions by performing three subject-dependent experiments, including Movie-Induced Emotion Recognition
We found that the differential entropy (DE) of the gamma band and the first difference of the first intrinsic mode function (IMF1), obtained through empirical mode decomposition (EMD), have good classification performance
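The IMF1 feature mentioned above can be obtained by decomposing each channel with EMD and summarizing the first-order difference of the first (highest-frequency) mode. A minimal sketch, assuming the PyEMD package (EMD-signal) is available and using a mean-absolute-difference statistic as an illustrative stand-in for the paper's exact feature definition:

```python
import numpy as np
from PyEMD import EMD  # pip install EMD-signal

def imf1_first_difference(signal):
    """Decompose a single-channel EEG segment with EMD and return the
    mean absolute first-order difference of the first IMF (IMF1)."""
    imfs = EMD().emd(signal)   # shape: (n_imfs, n_samples)
    imf1 = imfs[0]             # highest-frequency mode
    return np.mean(np.abs(np.diff(imf1)))

def imf1_features(eeg_segment):
    """One IMF1 first-difference feature per channel for an
    (n_channels, n_samples) segment."""
    return np.array([imf1_first_difference(ch) for ch in eeg_segment])
```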
Summary
Given that emotion plays an important role in our daily lives and work, the real-time assessment and regulation of emotions can improve our quality of life. Emotion recognition will facilitate natural human–machine interaction and communication. Recognizing the true emotional states of patients, particularly those who have difficulty expressing themselves, will help improve the quality of medical care. Emotion recognition is a crucial component of human-computer interaction (HCI) systems and can effectively improve communication between humans and machines [1,2]. Emotion recognition based on EEG signals has gained considerable attention, but it remains challenging given the vague boundaries between emotions and the variations among individuals. Researchers have used various affective materials, such as images, sounds, and videos, to elicit emotions.