Abstract

Facial expressions are a commonly used implicit measurement for in-vehicle affective computing. However, the time courses and underlying mechanisms of facial expressions have so far received little attention. According to the Component Process Model of emotions, facial expressions result from an individual's appraisals, which are assumed to occur in sequence. Therefore, a multidimensional and dynamic analysis of drivers' fear based on facial expression data could profit from a consideration of these appraisals. A driving simulator experiment with 37 participants was conducted, in which fear and relaxation were induced. The facial expression indicators of the high novelty and low power appraisals were significantly activated after a fear event (high novelty: Z = 2.80, p < 0.01, rcontrast = 0.46; low power: Z = 2.43, p < 0.05, rcontrast = 0.50). Furthermore, after the fear event, the activation of high novelty occurred earlier than that of low power. These results suggest that multidimensional analysis of facial expressions is a suitable approach for the in-vehicle measurement of drivers' emotions. Moreover, a dynamic analysis of drivers' facial expressions that considers the effects of appraisal components can add valuable information for the in-vehicle assessment of emotions.

Highlights

  • Over the past decade, affective computing has come into the focus of research on driver monitoring systems, because some emotions are supposed to impact drivers’ cognitive capabilities necessary for driving and risk perception (Jeon et al., 2011)

  • The participants’ ratings on the Positive and Negative Affect Schedule (PANAS) item “relax” were significantly higher in the baseline condition (BL) scenarios than in the Fear scenarios according to a Wilcoxon test for dependent samples (Z = −4.27, p < 0.001, rcontrast = 0.7)

  • Significant differences between Fear and BL scenarios were found in the SAM dimensions arousal (Z = 3.51, p < 0.001, rcontrast = 0.58) and valence (Z = −3.43, p < 0.001, rcontrast = 0.56)

Introduction

Affective computing has come into the focus of research on driver monitoring systems, because some emotions are supposed to impact drivers’ cognitive capabilities necessary for driving and risk perception (Jeon et al., 2011). Detecting and mitigating driver emotions by using affective computing in an emotion-aware system may help ensure driving safety (Ihme et al., 2019). One idea of such a system is to interpret the user’s emotional state and provide assistance that supports users in reducing the negative consequences of certain emotional states (Klein et al., 2002; Tews et al., 2011; Jeon, 2015; Löcken et al., 2017; Ihme et al., 2018). A recent study confirmed that the recognition of emotions from dynamic facial expressions was more accurate than from static ones (Namba et al., 2018), which suggests that considering the multidimensional and dynamic nature of emotions for affective computing may increase the possibility of a practical implementation of emotion-aware systems. Investigating the multidimensional and dynamic nature of drivers’ emotions would thus contribute to the development of reliable in-vehicle emotion measurement.
