Abstract

In this article, we discuss the strong relationship between affect and cognition and the importance of emotions in Multimodal Human-Computer Interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system, which aims to recognize its users' emotions and respond to them appropriately, depending on the current context or application. We then describe the design of the emotion elicitation experiment we conducted, collecting physiological signals of the autonomic nervous system (galvanic skin response, heart rate, temperature) via wearable computers and mapping them to specific emotions (Sadness, Anger, Fear, Surprise, Frustration, and Amusement). We present the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions and generalize their learning to recognize emotions from new collections of signals. We finally discuss the broader impact and possible applications of emotion recognition for multimodal intelligent systems.
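As a rough illustration of the classification step described above (not the authors' implementation), the sketch below summarizes each elicitation trial's galvanic skin response, heart rate, and temperature as a fixed-length feature vector and trains a generic supervised classifier to predict one of the six emotion labels. The feature choices, the k-nearest-neighbor learner, the helper name `extract_features`, and the randomly generated placeholder data are illustrative assumptions only.

```python
# Minimal sketch: supervised emotion classification from physiological features.
# Assumptions (not from the paper): mean/std features per channel, k-NN learner,
# and fabricated placeholder data so the example runs end to end.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neighbors import KNeighborsClassifier

EMOTIONS = ["sadness", "anger", "fear", "surprise", "frustration", "amusement"]

def extract_features(gsr, heart_rate, temperature):
    """Summarize one elicitation trial as a fixed-length feature vector.
    Mean and standard deviation per channel are illustrative choices only."""
    return np.array([
        gsr.mean(), gsr.std(),
        heart_rate.mean(), heart_rate.std(),
        temperature.mean(), temperature.std(),
    ])

# X: one feature vector per trial; y: the emotion elicited in that trial.
# Placeholder signals stand in for the wearable-sensor recordings.
rng = np.random.default_rng(0)
X = np.vstack([
    extract_features(
        rng.normal(size=200),          # galvanic skin response samples
        rng.normal(70, 5, 200),        # heart rate samples (bpm)
        rng.normal(33, 1, 200),        # skin temperature samples (deg C)
    )
    for _ in range(120)
])
y = rng.choice(EMOTIONS, size=len(X))

# Hold out part of the trials to test generalization to new signal collections.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```

In practice, any of the supervised learners mentioned in the abstract could replace the classifier in the pipeline; the key design point is mapping per-trial physiological summaries to emotion labels and evaluating on trials unseen during training.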
