Abstract

With advances in human-computer interaction and affective computing, emotion estimation has become a very active area of research. In the literature, most emotion recognition systems fall short because of the complexity of processing large volumes of physiological data and of analyzing various kinds of emotions within a single framework. The aim of this paper is to present a rigorous and effective computational framework for recognizing and classifying human affect along the arousal, valence, and dominance dimensions. In the proposed algorithm, physiological instances from the multimodal DEAP emotion dataset are used to analyze and characterize emotional patterns. Physiological features are employed to predict valence-arousal-dominance (VAD) levels via an Extreme Learning Machine (ELM). We adopt feature-level fusion to exploit the complementary information of several physiological sensors and thereby improve classification performance. The proposed framework is also evaluated in the valence-arousal (V-A) quadrant by predicting four emotional classes. The obtained results demonstrate the robustness and correctness of the proposed framework compared with other recent studies, and confirm the suitability of the regularized ELM (R-ELM) for estimating and recognizing emotional responses.
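
As a minimal illustration of the modeling step described in the abstract, the sketch below shows feature-level fusion of physiological features followed by a regularized ELM predicting VAD levels. All array shapes, channel names, the hidden-layer size, and the regularization constant C are illustrative assumptions, not values or code reported in the paper.

```python
# Minimal sketch (not the authors' implementation) of a regularized Extreme
# Learning Machine (R-ELM) for VAD regression over fused physiological features.
import numpy as np

class RELM:
    def __init__(self, n_hidden=200, C=10.0, seed=0):
        self.n_hidden = n_hidden          # number of random hidden neurons
        self.C = C                        # ridge regularization strength (assumed value)
        self.rng = np.random.default_rng(seed)

    def fit(self, X, T):
        # Random input weights and biases: fixed, never trained in an ELM.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)  # hidden-layer activations
        # Closed-form output weights: beta = (H'H + I/C)^-1 H'T
        A = H.T @ H + np.eye(self.n_hidden) / self.C
        self.beta = np.linalg.solve(A, H.T @ T)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Feature-level fusion: concatenate per-trial features from several
# physiological channels (channel names and dimensions are placeholders).
X_eeg = np.random.rand(100, 32)   # e.g. EEG-derived features
X_gsr = np.random.rand(100, 8)    # e.g. GSR-derived features
X = np.hstack([X_eeg, X_gsr])     # fused feature vector per trial
T = np.random.rand(100, 3)        # valence, arousal, dominance targets
vad_pred = RELM().fit(X, T).predict(X)
```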
