Abstract

Since emotions play an important role in the daily life of human beings, the need for automatic emotion recognition has grown with the increasing role of human-computer interface applications. Emotion recognition can be done from text, speech, facial expressions, or gestures. In this paper, we concentrate on the recognition of "inner" emotions from electroencephalogram (EEG) signals. We propose a real-time, fractal-dimension-based algorithm for quantifying basic emotions using the Arousal-Valence emotion model. Two emotion induction experiments, with music stimuli and with sound stimuli from the International Affective Digitized Sounds (IADS) database, were proposed and implemented. Finally, the real-time algorithm was proposed, implemented, and tested to recognize six emotions: fear, frustration, sadness, happiness, pleasantness, and satisfaction. Real-time applications were proposed and implemented in 3D virtual environments. The user's emotions are recognized and visualized in real time on his/her avatar, adding one more so-called "emotion dimension" to human-computer interfaces. An EEG-enabled music therapy site was proposed and implemented; the music played to patients helps them deal with problems such as pain and depression. An EEG-based, web-enabled music player that can play music according to the user's current emotional state was designed and implemented.

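The abstract does not specify which fractal-dimension estimator or channel layout the algorithm uses, so the sketch below is only illustrative: it computes Higuchi's fractal dimension, a common choice for EEG analysis, on a sliding window and maps the result to an arousal-valence quadrant. The channel pairing (e.g., symmetric frontal channels such as AF3/AF4), the asymmetry-based valence rule, and the thresholds are hypothetical placeholders that would require per-subject calibration; they are not the paper's actual parameters.

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Estimate the Higuchi fractal dimension of a 1-D signal x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_lk = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)  # sub-series x[m], x[m+k], x[m+2k], ...
            if len(idx) < 2:
                continue
            # normalized curve length of this sub-series at scale k
            lm = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k * k)
            lengths.append(lm)
        log_lk.append(np.log(np.mean(lengths)))
    # FD is the slope of log L(k) versus log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), log_lk, 1)
    return slope

def classify_window(left_eeg, right_eeg, arousal_thr=1.5, valence_thr=0.0):
    """Map one EEG window to an arousal-valence quadrant (illustrative only).

    left_eeg / right_eeg: 1-D arrays from symmetric frontal channels
    (e.g., AF3 / AF4); both thresholds are hypothetical and need calibration.
    """
    fd_left, fd_right = higuchi_fd(left_eeg), higuchi_fd(right_eeg)
    arousal = "high" if max(fd_left, fd_right) > arousal_thr else "low"
    # hemispheric FD asymmetry used as a rough valence indicator (assumption)
    valence = "positive" if (fd_right - fd_left) > valence_thr else "negative"
    return arousal, valence

# Example: one 4-second window at 128 Hz from two simulated channels
rng = np.random.default_rng(0)
left, right = rng.standard_normal(512), rng.standard_normal(512)
print(classify_window(left, right))
```

In a real-time setting, `classify_window` would be called on each new window of streamed EEG samples, and the resulting quadrant would drive the avatar visualization or music selection described in the abstract.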