Abstract

Automated analysis of human affective behaviour has received significant research attention over the last decade due to its practical importance in areas such as health, human-computer interaction, social robotics, and marketing, to mention but a few. Traditionally, different behavioural cues are first extracted from the sensory inputs (video, speech and/or physiology). Then, machine learning algorithms are designed to analyse these behavioural cues with the aim of automatically predicting target affective states. However, the modelling of human affect (such as emotion expressions or pain levels) is rather challenging due to the many possible sources of variation in target data, including the target subjects (male vs. female, children vs. adults, etc.), their tasks (human-human or human-robot interaction), culture (eastern vs. western), and so on. All of these make the task of automated estimation of human affect highly context-sensitive. In this talk, I will first provide a general overview of recent trends in the field of affective computing. Then, I will illustrate its application to the domain of facial behaviour analysis by focusing on the most recent advances in estimation of human facial behaviour from static images and video data. To this end, I will describe the state-of-the-art machine learning techniques proposed for context-sensitive modelling of human facial expressions of basic emotions, facial action units and their intensity, as well as the clinical measurement of patients' pain levels. Finally, I will outline the main challenges and provide future directions for applying these approaches 'in-the-wild', i.e., in naturalistic scenarios such as human-robot interaction and in the context of treatment of neurodevelopmental disorders (such as autism).

Full Text
Paper version not known