Abstract

A social robot should be able to autonomously interpret human affect and adapt its behavior accordingly in order for successful social human–robot interaction to take place. This paper presents a modular non-contact automated affect-estimation system that employs support vector regression over a set of novel facial expression parameters to estimate a person's affective state using a two-dimensional valence–arousal model of affect. By utilizing a continuous two-dimensional model rather than a traditional discrete categorical model of affect, the proposed system captures the complex and ambiguous emotions that are prevalent in real-world scenarios. As the goal is to incorporate this recognition system into robots, real-time estimation of spontaneous natural facial expressions in response to environmental and interactive stimuli is an objective. The proposed system can be combined with affect detection techniques using other modalities, such as speech, body language, or physiological signals, in order to develop an accurate multi-modal affect estimation system for social HRI applications. Experiments presented herein demonstrate the system's ability to successfully estimate the affect of a diverse group of unknown individuals exhibiting spontaneous natural facial expressions.
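The core idea described above can be sketched in code: train one support vector regressor per affect dimension, mapping facial expression parameters to continuous valence and arousal values. This is a minimal illustrative sketch, not the authors' implementation; the feature layout, data, and hyperparameters here are assumptions made for demonstration.

```python
# Minimal sketch of per-dimension SVR affect estimation (illustrative only;
# features and labels are synthetic stand-ins for the paper's facial
# expression parameters and annotated valence/arousal values).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic "facial expression parameters" (e.g., mouth openness, brow raise).
X = rng.uniform(-1.0, 1.0, size=(200, 6))

# Synthetic ground-truth affect labels on the valence-arousal plane.
valence = np.tanh(X[:, 0] - 0.5 * X[:, 1]) + 0.05 * rng.standard_normal(200)
arousal = np.tanh(X[:, 2] + 0.5 * X[:, 3]) + 0.05 * rng.standard_normal(200)

# One regressor per dimension of the continuous 2-D model of affect.
valence_model = SVR(kernel="rbf", C=1.0).fit(X, valence)
arousal_model = SVR(kernel="rbf", C=1.0).fit(X, arousal)

# Estimate affect for a new, unseen facial expression sample.
sample = rng.uniform(-1.0, 1.0, size=(1, 6))
estimate = (float(valence_model.predict(sample)[0]),
            float(arousal_model.predict(sample)[0]))
print(estimate)
```

In a multi-modal setting, the same regression scheme could be trained per modality and the resulting valence–arousal estimates fused downstream.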
