Abstract

Music can evoke a variety of emotions, which may be manifested by distinct signals on the electroencephalogram (EEG). Many previous studies have examined the associations between specific aspects of music, including the subjective emotions aroused, and EEG signal features. However, no study has comprehensively examined music-related EEG features and selected those with the strongest potential for discriminating emotions. This paper therefore conducted a series of experiments to identify the most influential EEG features induced by music evoking different emotions (calm, joy, sadness, and anger). We extracted 27-dimensional features from each of 12 electrode positions, then used the correlation-based feature selection (CFS) method to identify the feature set most strongly related to the original features but with the lowest redundancy. Several classifiers, including support vector machine (SVM), C4.5, linear discriminant analysis (LDA), and back-propagation neural network (BPNN), were then used to test the recognition accuracy of the original and selected feature sets. Finally, the results are analyzed in detail, and the relationships between the selected feature set and human emotions are clearly shown. From the classification results of 10 random examinations, it can be concluded that the selected feature set of the Pz electrode is more effective than the other feature sets when used as the key feature set for classifying human emotional states.

Highlights

  • Recognition of emotional states is an important aim in the development of advanced brain-computer interfaces (BCIs)

  • Numerous studies have identified EEG signals associated with distinct features of music, including familiarity, level of processing, phrase rhythm, and subjective emotional response. Thammasan et al. extracted power spectral densities and fractal dimensions from the Database for Emotion Analysis using Physiological Signals (DEAP) and found that using low-familiarity music improved recognition accuracy regardless of whether the classifier was a support vector machine (SVM), multilayer perceptron, or C4.5 [3]

  • Kumagai et al. investigated the relationship between cortical response and familiarity of music. They found that the two peaks of the cross-correlation values were significantly larger when listening to unfamiliar or scrambled music than when listening to familiar music
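The power spectral density features mentioned in the highlights can be illustrated with a simple band-power computation. The sketch below is not the paper's method: the band boundaries, sampling rate, and use of a plain periodogram (rather than a specific PSD estimator) are all assumptions for illustration.

```python
import numpy as np

# Conventional EEG frequency bands in Hz (an assumption; the paper's exact
# feature definitions are not listed here).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs=256):
    """Mean periodogram power per frequency band for one EEG channel.

    A minimal stand-in for power-spectral-density features: compute a
    periodogram via the real FFT, then average the power falling in
    each band's frequency range.
    """
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * signal.size)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}
```

For example, a noisy 10 Hz sinusoid (inside the alpha band) yields its largest mean band power in "alpha". Feature vectors like these, stacked across channels, are what classifiers such as SVM or C4.5 would consume.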



Introduction

Recognition of emotional states is an important aim in the development of advanced brain-computer interfaces (BCIs). To compensate for the limitations of single-feature recognition, many previous studies have extracted one or several linear or nonlinear dynamic characteristics of EEG signals to distinguish different music stimuli by machine learning. However, few studies have conducted a comprehensive, unbiased analysis of whole-brain EEG signals associated with music-evoked emotions and selected those with the highest discriminative power across various classifiers [24]. Our goal was to obtain the EEG feature set most influential for human emotion classification. To achieve this, 18-dimensional linear features and 9-dimensional nonlinear features were extracted for every electrode, and the correlation-based feature selection (CFS) method was employed to select the influential feature set. To verify the influence of the selected feature set, several classification methods, including BPNN, SVM, C4.5, and LDA, were applied to human emotion classification. The experimental results showed that the selected feature set of the Pz electrode combined with the C4.5 classifier was most effective in human emotion recognition.
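The CFS idea used above scores a candidate subset by how strongly its features correlate with the class relative to how strongly they correlate with each other, via the merit k·r_cf / sqrt(k + k(k−1)·r_ff). The following is a minimal sketch, not the paper's implementation: the greedy forward search and the use of absolute Pearson correlation are my assumptions (Hall's original CFS uses symmetrical uncertainty as the correlation measure).

```python
import numpy as np

def cfs_merit(X, y, subset):
    """CFS merit of a feature subset: k*r_cf / sqrt(k + k*(k-1)*r_ff),
    where r_cf is the mean |feature-class correlation| and r_ff the mean
    |feature-feature correlation| (Pearson, as a simplifying assumption)."""
    k = len(subset)
    r_cf = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in subset])
    if k == 1:
        return r_cf
    pairs = [(a, b) for i, a in enumerate(subset) for b in subset[i + 1:]]
    r_ff = np.mean([abs(np.corrcoef(X[:, a], X[:, b])[0, 1]) for a, b in pairs])
    return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)

def cfs_forward_select(X, y):
    """Greedy forward search: repeatedly add the feature that most
    improves the merit, stopping when no addition helps."""
    remaining = list(range(X.shape[1]))
    selected, best_merit = [], 0.0
    while remaining:
        merit, j = max((cfs_merit(X, y, selected + [j]), j) for j in remaining)
        if merit <= best_merit:
            break
        selected.append(j)
        remaining.remove(j)
        best_merit = merit
    return selected
```

On data where one feature tracks the class, one is noise, and one duplicates the first, the search keeps a class-relevant feature and rejects the noise feature, while the redundant copy adds (almost) no merit, which is exactly the low-redundancy behavior the paper relies on.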

Methods
Data Analysis
Electrode positions analyzed: F4, C3, C4, P3, P4, O1, O2, F7, F8, T3, T4
Experiments and Analysis
