Abstract

Affective computing in general, and human activity and intention analysis in particular, comprise a rapidly growing field of research. Head pose and emotion changes present serious challenges in applications such as training players and enriching the ludology experience in serious games, analysing customer satisfaction with broadcast and web services, and monitoring a driver’s attention. Given the increasing prominence and utility of depth sensors, it is now feasible to perform large-scale collection of three-dimensional (3D) data for subsequent analysis. Discriminative random regression forests were selected to estimate head pose changes rapidly and accurately in an unconstrained environment. For the secondary task of recognising four universal dominant facial expressions (happiness, anger, sadness and surprise), emotion recognition via facial expressions (ERFE) was adopted. A lightweight data exchange format, JavaScript Object Notation (JSON), is then employed to manipulate the data extracted from the two aforementioned settings. Motivated by the need to generate comprehensible visual representations from different sets of data, in this paper we introduce a system capable of monitoring human activity through head pose and emotion changes, utilising an affordable 3D sensing technology (the Microsoft Kinect sensor).
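As a minimal sketch of the JSON-based data exchange described above, the snippet below serialises a single hypothetical event combining a head pose estimate and a detected emotion, as such a record might be passed to a web visualisation layer. The field names (`timestamp`, `head_pose`, `emotion`) are illustrative assumptions, not the paper's actual schema.

```python
import json

# Hypothetical record for one observation: head pose angles (degrees)
# plus the dominant facial expression detected at that moment.
# Field names are assumptions for illustration only.
event = {
    "timestamp": "2015-05-11T10:24:03.512Z",
    "head_pose": {"yaw": 12.4, "pitch": -3.1, "roll": 0.8},
    "emotion": "happiness",  # one of: happiness, anger, sadness, surprise
}

# Serialise to a JSON string for transport, then parse it back.
payload = json.dumps(event)
decoded = json.loads(payload)
print(decoded["emotion"])  # prints "happiness"
```

A stream of such records, one per sampling interval, is all a browser-side charting library needs to plot head pose changes and emotion events over time.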

Highlights

  • Affective computing in general and human intention analysis comprise a rapidly growing field of research, due to the constantly growing interest in applying automatic human activity analysis to all kinds of multimedia recordings involving people

  • We focus in particular on recognising head pose and facial expression changes, which can provide a rich source of information for analysing human activity in several areas of human-computer interaction (HCI)

  • The real-time head pose estimation and facial expression events are separately obtained for different users sitting and moving their head without restriction in front of a Microsoft Kinect sensor for specified intervals


Summary

Introduction

Affective computing in general and human intention analysis comprise a rapidly growing field of research, due to the constantly growing interest in applying automatic human activity analysis to all kinds of multimedia recordings involving people. We focus in particular on recognising head pose and facial expression changes, which can provide a rich source of information for analysing human activity in several areas of human-computer interaction (HCI). Facial expression is one of the most dominant, natural and instantaneous means for human beings to communicate their emotions and intentions [7]. Although head pose changes provide a rich source of information that can be exploited in several fields of computer vision, the literature contains no work on the subsequent analysis of such findings for the task at hand. The aim of the present work is to develop a framework capable of analysing human activity from depth data of head pose changes and emotion recognition via facial expressions, by visualising them on the web.

Related Work
Overview of Methods for Capturing Head Pose and Emotion Changes
Estimation of Head Pose Changes
Emotion Recognition from Facial Expressions
Data Compilation and Experimental Setup
Visualisations on the Web
Representative Scenario
Head Pose
Head Pose Changes across Time
Head Pose Changes Grouped by Direction
Intensities of Head Pose Changes
Head Pose Changes Grouped by Proportion of the Direction
Emotions
Emotion Changes across Time
11 May 2015
Facial Expressions Grouped by Emotion
Emotions Grouped by Time Intervals
Findings
Conclusions and Future Work

