Abstract

In this paper, we propose an emotion classification model that differentiates human emotions using dynamic visual and electroencephalography (EEG) features. To study human emotions in a more natural setting, we use dynamic stimuli such as movies. We introduce the 3D fuzzy GIST to effectively describe both dynamic visual features and EEG signals. The extracted features serve as inputs to an adaptive neuro-fuzzy inference system (ANFIS), which is trained with mean opinion scores as the teaching signals. Experimental results show that the system using both low-level visual features and semantic-level EEG features not only discriminates positive emotional states from negative ones, but also yields more stable results than a model using visual or EEG information alone.
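The classification stage described above rests on Takagi-Sugeno fuzzy inference, which forms the forward pass of an ANFIS. The following is a minimal sketch of that inference step only, not the paper's actual model: the two-dimensional feature vector, the rule parameters, and the "visual/EEG score" inputs are hypothetical placeholders standing in for the 3D fuzzy GIST features, and no parameter learning is shown.

```python
import math

def gaussian_mf(x, c, sigma):
    """Gaussian membership: degree to which x belongs to a fuzzy set."""
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

def sugeno_infer(features, rules):
    """First-order Takagi-Sugeno inference (the forward pass of an ANFIS).

    Each rule is (per-input (center, sigma) membership params, consequent
    coefficients [a_1..a_n, bias]). Firing strength is the product of
    membership degrees; the output is the firing-strength-weighted
    average of the linear consequents.
    """
    strengths, outputs = [], []
    for mfs, coeffs in rules:
        w = 1.0
        for x, (c, s) in zip(features, mfs):
            w *= gaussian_mf(x, c, s)
        strengths.append(w)
        outputs.append(sum(a * x for a, x in zip(coeffs, features)) + coeffs[-1])
    total = sum(strengths) or 1e-12  # guard against all-zero firing
    return sum(w * o for w, o in zip(strengths, outputs)) / total

# Hypothetical 2-D feature: [visual feature score, EEG feature score].
rules = [
    ([(0.8, 0.3), (0.7, 0.3)], [0.5, 0.5, 0.5]),    # "both high -> positive"
    ([(0.2, 0.3), (0.3, 0.3)], [-0.5, -0.5, -0.2]), # "both low  -> negative"
]

score = sugeno_infer([0.9, 0.8], rules)
label = "positive" if score > 0 else "negative"
```

In a full ANFIS, the membership parameters and consequent coefficients above would be fitted to training targets (here, the mean opinion scores) by gradient descent and least squares rather than set by hand.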
