Abstract

Emotion plays an important role in human interaction. People can express their emotions through words, voice intonation, facial expressions, and body language. However, brain–computer interface (BCI) systems have not yet reached the level required to interpret emotions. Automatic emotion recognition based on BCI systems has been an active research topic in recent decades. Electroencephalogram (EEG) signals are one of the most crucial resources for these systems. The main advantage of using EEG signals is that they reflect genuine emotion and can easily be processed by computer systems. In this study, EEG signals related to positive and negative emotions were classified with channel selection as a preprocessing step. The Self-Assessment Manikin was used to determine emotional states. We employed the discrete wavelet transform and machine learning techniques, namely the multilayer perceptron neural network (MLPNN) and the k-nearest neighbors (kNN) algorithm, to classify the EEG signals. The classifier algorithms were initially used for channel selection. The EEG channels for each participant were evaluated separately, and the five EEG channels that offered the best classification performance were determined. Final feature vectors were then obtained by combining the features of the EEG segments belonging to these channels. The final feature vectors, labeled with the related positive and negative emotions, were classified separately using the MLPNN and kNN algorithms. The classification performances obtained with the two algorithms were computed and compared. The average overall accuracies were 77.14% with MLPNN and 72.92% with kNN.
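The pipeline summarized above (per-channel DWT features, channel selection by classification score, concatenation of the best five channels, MLPNN/kNN classification) can be sketched as follows. This is a minimal illustration on synthetic data: the wavelet (`db4`), decomposition level, sub-band statistics, segment counts, and classifier settings are assumptions, not the study's exact configuration.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for EEG data: 40 segments x 32 channels x 512 samples,
# with alternating positive/negative emotion labels.
n_segments, n_channels, n_samples = 40, 32, 512
eeg = rng.standard_normal((n_segments, n_channels, n_samples))
labels = np.tile([0, 1], n_segments // 2)

def dwt_features(signal, wavelet="db4", level=4):
    """Summarize each DWT sub-band with simple statistics (assumed features)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:  # [cA4, cD4, cD3, cD2, cD1] for level=4
        feats += [np.mean(np.abs(c)), np.std(c), np.sum(c ** 2)]
    return np.array(feats)

# Per-channel feature matrices: one (n_segments x n_features) matrix per channel.
channel_feats = [
    np.array([dwt_features(eeg[s, ch]) for s in range(n_segments)])
    for ch in range(n_channels)
]

# Channel selection: score each channel with a kNN classifier via
# cross-validation and keep the five best-performing channels.
knn = KNeighborsClassifier(n_neighbors=3)
scores = [cross_val_score(knn, X, labels, cv=4).mean() for X in channel_feats]
best5 = np.argsort(scores)[-5:]

# Final feature vectors: concatenate the features of the selected channels.
X_final = np.hstack([channel_feats[ch] for ch in best5])

# Classify the final feature vectors with MLPNN and kNN separately.
mlp = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
acc_mlp = cross_val_score(mlp, X_final, labels, cv=4).mean()
acc_knn = cross_val_score(knn, X_final, labels, cv=4).mean()
print(f"MLPNN accuracy: {acc_mlp:.2f}, kNN accuracy: {acc_knn:.2f}")
```

On random data both classifiers hover near chance level; with real emotion-labeled EEG segments the same selection-then-concatenation scheme is what produces the per-participant channel rankings described in the abstract.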

Highlights

  • Emotion is a component of human consciousness and plays a critical role in rational decision-making, perception, human interaction, and human intelligence

  • The aim of this study was to classify EEG signals related to different emotions elicited by audiovisual stimuli, with channel selection as a preprocessing step

  • Feature vectors related to EEG segments consisting of positive and negative emotions were classified by a multilayer perceptron neural network (MLPNN)

Introduction

Emotion is a component of human consciousness and plays a critical role in rational decision-making, perception, human interaction, and human intelligence. Emotions can be reflected through non-physiological signals such as words, voice intonation, facial expression, and body language, and many studies on emotion recognition based on these non-physiological signals have been reported in recent decades [1, 2]. Signals obtained by recording the voltage changes occurring on the skull surface as a result of the electrical activity of active neurons in the brain are called EEG [3]. From the clinical point of view, EEG is the most widely used brain-activity-measuring technique for emotion recognition. EEG-based BCI systems provide a new communication channel by detecting variations in the underlying pattern of brain activity while different tasks are performed [4]. However, BCI systems have not yet reached the level required to interpret people's emotions.