Abstract

Real-time emotion recognition using electroencephalography (EEG) signals plays a key role in human-computer interaction and affective computing. Existing emotion recognition models, which use stimuli such as music and pictures in controlled lab settings and cover a limited number of emotion classes, have low ecological validity. Moreover, identifying significant EEG features and electrodes is important for effective emotion recognition. In our proposed model, we use the DEAP dataset, consisting of physiological signals collected from 32 participants as they watched 40 movie clips (each 60 seconds long). The main objective of this study is to explore multi-domain (time, wavelet, and frequency) features and thereby identify a set of stable features that contribute to emotion classification over a larger number of emotion classes. Our proposed model identifies nine classes of emotions, namely happy, pleased, relaxed, excited, neutral, calm, distressed, miserable, and depressed, with an average accuracy of 65.92%. To this end, we use a support vector machine as the classifier along with 10-fold and leave-one-out cross-validation techniques. We achieve a significant emotion classification accuracy, which could be vital for developing affective computing solutions that deal with a larger number of emotional states.

Highlights

  • Affective computing is a specialized field of artificial intelligence (AI) and is used for processing, interpreting, and identifying emotional states

  • For the classification of nine emotional states, our proposed method achieved an average accuracy of 65.72% (10-fold) and 65.92% (LOOCV), which to the best of our knowledge is the highest among state-of-the-art methods

  • LOOCV was used to further verify the results and perform subject-independent analysis, where each subject was used as the test instance once while the remaining subjects were used for training
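The subject-independent LOOCV procedure described above can be sketched in Python with scikit-learn. This is a minimal illustration, not the authors' implementation: the synthetic feature matrix, the number of subjects and trials, and the RBF-kernel SVM settings are all assumptions, and the multi-domain feature extraction from DEAP is not shown.

```python
# Sketch of leave-one-subject-out cross-validation with an SVM classifier.
# All data below is synthetic; real use would substitute DEAP-derived
# time/wavelet/frequency-domain features and true emotion labels.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_features = 8, 20, 16  # assumed sizes

# One row per trial; label is one of 9 emotion classes.
X = rng.normal(size=(n_subjects * trials_per_subject, n_features))
y = rng.integers(0, 9, size=n_subjects * trials_per_subject)
# Subject ID for every trial, so each fold holds out one whole subject.
groups = np.repeat(np.arange(n_subjects), trials_per_subject)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
accuracies = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf.fit(X[train_idx], y[train_idx])          # train on all other subjects
    accuracies.append(clf.score(X[test_idx], y[test_idx]))  # test on held-out subject

mean_acc = float(np.mean(accuracies))  # average subject-independent accuracy
```

Grouping by subject (rather than plain 10-fold splits over trials) ensures no trial from the test subject leaks into training, which is what makes the evaluation subject-independent.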

Introduction

Affective computing is a specialized field of artificial intelligence (AI) and is used for processing, interpreting, and identifying emotional states. Emotions play a vital role in our daily activities, including decision making, communication, and personal development. Because emotions come naturally to us as human beings, significant attention has been devoted to detecting emotions during human-robot interaction to enable affective computing [1]. It has been shown that computers that can recognize and respond to human emotions are critical for the progress of human-computer interaction [2], and that analysis of affective physiological signals can lead towards machine intelligence. It has also been observed that physiological signals can be more beneficial for machine intelligence than vocal or visual data [3].

