Abstract

The breadth of possible applications has made emotion recognition both unavoidable and challenging in computer science, human-machine interaction, and affective computing, fields that in turn increasingly require real-time applications or interactions in everyday-life scenarios. However, while highly desirable, accurate and automated emotion classification remains a challenging problem. To this end, this study presents an automated emotion recognition model based on easily accessible physiological signals and deep learning (DL). A feedforward neural network was employed as the DL algorithm, and its outcome was compared with canonical machine learning algorithms such as random forest (RF). The developed DL model relied on the combined use of wearables and contactless technologies, such as thermal infrared imaging. The model classifies the emotional state into four classes derived from the combination of valence and arousal (following the four-quadrant structure of the circumplex model of affect), with an overall accuracy of 70%, outperforming the 66% accuracy reached by the RF model. Given the ecological and agile nature of the techniques used, the proposed model could enable innovative applications in the affective computing field.
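The four classes mentioned above follow the quadrant structure of the circumplex model of affect. As a minimal sketch of that mapping (assuming valence and arousal scores centered at zero; the paper's exact class boundaries and label names are not specified here):

```python
def quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair to one of the four quadrants
    of the circumplex model of affect. Boundary handling (>= 0)
    is an illustrative assumption, not the paper's exact rule."""
    if valence >= 0 and arousal >= 0:
        return "high-arousal positive"   # e.g., excitement
    if valence < 0 and arousal >= 0:
        return "high-arousal negative"   # e.g., stress
    if valence < 0:
        return "low-arousal negative"    # e.g., sadness
    return "low-arousal positive"        # e.g., calm
```

For example, `quadrant(0.5, 0.7)` falls in the high-arousal positive quadrant, while `quadrant(-0.3, -0.2)` falls in the low-arousal negative one.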

Highlights

  • Over the past few years, the study of emotion recognition has attracted growing interest, and presents an increasing trend in computer science

  • A confusion matrix (CM) represents a performance metric commonly employed in machine learning (ML) classification tasks

  • The CM for the random forest (RF) emotion classification based on cardiac features (i.e., heart rate variability (HRV) and blood volume pulse (BVP)) is reported in Figure 5a; the overall accuracy reached in this classification was 59.5%
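A confusion matrix tabulates, for each true class, how predictions are distributed across the predicted classes; overall accuracy is the trace of the matrix divided by its total count. A minimal standard-library sketch of this metric (the data shown are illustrative, not the paper's results):

```python
from collections import Counter

def confusion_matrix(y_true, y_pred, labels):
    """Build a labels x labels matrix: rows are true classes,
    columns are predicted classes."""
    counts = Counter(zip(y_true, y_pred))
    return [[counts[(t, p)] for p in labels] for t in labels]

def accuracy(cm):
    """Overall accuracy: correct (diagonal) over total predictions."""
    correct = sum(cm[i][i] for i in range(len(cm)))
    total = sum(sum(row) for row in cm)
    return correct / total

# Toy example with two of the four emotion classes
labels = ["Q1", "Q2"]
cm = confusion_matrix(["Q1", "Q1", "Q2", "Q2"],
                      ["Q1", "Q2", "Q2", "Q2"], labels)
# cm == [[1, 1], [0, 2]]; accuracy(cm) == 0.75
```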


Introduction

Over the past few years, the study of emotion recognition has attracted growing interest and shows an increasing trend in computer science. Emotion prediction and recognition play a vital role in various domains such as digital multimedia entertainment, autonomous driving, healthcare, and human–computer interfaces [1]. It has been shown that communication between humans and computers benefits from sensor-based emotion recognition, since humans experience discomfort when emotions are absent [2]. Reeves et al. stated that people treat computers the same way they treat people [3]; computers should therefore show empathy to their users. Emotions are also essential for motivation and learning [4]. Affective interaction could be beneficial and improve one's mental state.
