Abstract

Electroencephalography (EEG)-based emotion recognition has attracted increasing attention in human-computer interaction (HCI), but how to use cognitive principles to improve emotion recognition models remains a challenge. This paper investigates the cognitive process of music-evoked emotion and its application. First, a three-stage experimental paradigm with long-duration music stimuli was designed to evoke emotional responses, and EEG signals were recorded from 15 healthy adults while each listened to 16 music clips. A time-course analysis method for music-evoked emotions was then proposed to examine differences in brain activity: spectral power increased markedly in the alpha band and decreased slightly in the high-frequency beta and gamma bands during music listening, and an inspiring–keeping–fading time course was observed across the different emotional states. Next, the most relevant EEG features were selected through temporal correlation analysis between EEG and music features. Finally, an emotion prediction system was built on these cognitively inspired EEG features. Binary classification reached accuracies of 66.8% for valence and 59.5% for arousal; three-class classification reached 45.9% for valence and 45.1% for arousal. These results suggest that, with the help of cognitive principles, a better emotion recognition system can be built, and that understanding the cognitive process could promote the development of artificial intelligence.
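The band-power comparison described above (alpha increase versus beta/gamma decrease) can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the sampling rate (256 Hz), the conventional band limits (alpha 8–13 Hz, beta 13–30 Hz), and the synthetic alpha-dominant signal are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Mean power spectral density within a frequency band (Welch's method)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

# Synthetic EEG-like signal dominated by a 10 Hz (alpha) oscillation.
fs = 256                                  # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.randn(t.size)

alpha = band_power(eeg, fs, (8, 13))      # conventional alpha band
beta = band_power(eeg, fs, (13, 30))      # conventional beta band
print(alpha > beta)                        # alpha dominates this synthetic signal
```

In a real study such band powers would be computed per channel and per time window, which is what makes the time-course analysis of alpha, beta, and gamma activity possible.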
