Abstract

Human conversational partners usually try to interpret the speaker's or listener's affective cues and respond to them accordingly. Recently, the modelling and simulation of such behaviours have been recognized as an essential factor for more natural man-machine communication. The implicit emotion channels of human communication, such as speech, facial expression, gesture, and physiological responses, are generally used to extract emotion-relevant features for the computational perception of emotion. So far, research on emotion recognition has mostly dealt with the offline analysis of recorded emotion corpora; online processing (in real time or near real time) has hardly been addressed. Online processing is, however, a necessary prerequisite for the realization of human-computer interfaces that analyze and respond to the user's emotions while he or she is interacting with an application. In this paper, we first describe how we recognize emotions from various modalities, including speech, gestures, and biosignals. We then present Smart Sensor Integration (SSI), a framework we developed to meet the specific requirements of online emotion recognition.
