Abstract

A key to optimizing a user’s entertainment or learning experience in interactive games is understanding the player’s emotional responses. Most current methods rely on intrusive physiological signals to detect a player’s emotions. In this study, we propose a method that detects a player’s emotions from heart rate (HR) signals and facial expressions (FE). Because human emotion is perceived continuously, both HR and FE are recognized continuously from videos captured by a Kinect 2.0. A bidirectional long short-term memory (Bi-LSTM) network is used to learn the HR features, and a convolutional neural network (CNN) is trained to learn the FE features. To further meet real-time demands, a SOM-BP network is employed to fuse the HR and FE features and recognize the player’s emotion. Experimental results demonstrate that our model achieves high accuracy and low computation time for the four emotions “excitement”, “anger”, “sadness”, and “calmness” across different games. Moreover, an emotion’s intensity can be estimated from the HR value.
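To make the fusion stage concrete, the sketch below shows a minimal self-organizing map (SOM) of the kind that could cluster fused feature vectors before a BP (back-propagation) classifier assigns emotion labels. This is an illustrative sketch only, not the authors’ implementation: the grid size, learning-rate schedule, and the random vectors standing in for Bi-LSTM HR features and CNN FE features are all assumptions, and the BP classification stage is omitted.

```python
import numpy as np

def train_som(features, grid=(4, 4), epochs=20, lr0=0.5, seed=0):
    """Train a tiny self-organizing map on fused feature vectors.

    `features` is an (n_samples, n_dims) array; in the paper's setting
    each row would be HR features (Bi-LSTM) concatenated with FE
    features (CNN). Here any numeric vectors work.
    """
    rng = np.random.default_rng(seed)
    n_nodes = grid[0] * grid[1]
    weights = rng.normal(size=(n_nodes, features.shape[1]))
    # Grid coordinates of each node, used by the neighbourhood function.
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])],
                      dtype=float)
    sigma0 = max(grid) / 2.0
    for epoch in range(epochs):
        # Linearly decay the learning rate and neighbourhood radius.
        lr = lr0 * (1 - epoch / epochs)
        sigma = sigma0 * (1 - epoch / epochs) + 1e-3
        for x in features:
            # Best-matching unit: node whose weight vector is nearest to x.
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            d = np.linalg.norm(coords - coords[bmu], axis=1)
            h = np.exp(-(d ** 2) / (2 * sigma ** 2))  # neighbourhood weight
            weights += lr * h[:, None] * (x - weights)
    return weights

def bmu_index(weights, x):
    """Map a fused feature vector to its best-matching SOM node."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))
```

In a full pipeline, the BMU index (or the BMU’s grid coordinates) would be the low-dimensional input handed to the BP network for the final four-way emotion classification.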

Highlights

  • Nowadays more and more users are attracted by computer games owing to their ability to present information interactively and playfully

  • Games are becoming ever richer and are increasingly used to help users with practical problems in work, education, and daily life

  • Our model provides a non-contact way to use heart rate (HR) and facial expressions (FE) for emotion recognition


Summary

INTRODUCTION

Du et al.: Non-Contact Emotion Recognition Combining Heart Rate and FE for Interactive Gaming Environments

Nowadays more and more users are attracted by computer games owing to their ability to present information interactively and playfully. Some approaches recognize emotion from the player’s voice, but these usually fail because the background music of games interferes with the player’s voice. Reference [12] proposed a method for automatic emotion detection based on a player’s body movements in a sports game. In method [16], a classifier built on deep convolutional network features tracked the player’s facial expressions in real time with an optimal recognition rate of 94.4%. Methods based on facial expressions thus show strong performance.

METHODS
APPARATUS
Findings
CONCLUSION

