Abstract

Advanced Driver-Assistance Systems (ADASs) are used to increase safety in the automotive domain, yet current ADASs notably operate without taking drivers’ states into account, e.g., whether the driver is emotionally fit to drive. In this paper, we first review the state-of-the-art of emotional and cognitive analysis for ADAS: we consider psychological models, the sensors needed for capturing physiological signals, and the typical algorithms used for human emotion classification. Our investigation highlights a lack of advanced Driver Monitoring Systems (DMSs) for ADASs, which could increase driving quality and safety for both drivers and passengers. We then provide our view on a novel perception architecture for driver monitoring, built around the concept of Driver Complex State (DCS). DCS relies on multiple non-obtrusive sensors and Artificial Intelligence (AI) for uncovering the driver state and uses it to implement innovative Human–Machine Interface (HMI) functionalities. This concept will be implemented and validated in the recently funded EU NextPerception project, which is briefly introduced.

Highlights

  • In recent years, the automotive field has been pervaded by an increasing level of automation. This automation has introduced new possibilities with respect to manual driving.

  • We introduce a novel perception architecture for Advanced Driver-Assistance Systems (ADASs) based on the idea of Driver Complex State (DCS).

  • In the fields of traffic research and Human–Computer Interaction (HCI), several studies have been published in the past decades on the detection and exploitation of the driver’s cognitive and emotional states, given the significant impact that such conditions have on driver performance and the consequent effects on road safety [57,58].


Summary

A Roadmap

Luca Davoli 1,*, Marco Martalò 1, Antonio Cilfone 1, Laura Belli 1, Gianluigi Ferrari 1, Roberta Presta 2, Roberto Montanari 2,3, Maura Mengoni 4, Luca Giraldi 4, Elvio G. Amparore 5, Marco Botta 5, Idilio Drago 5, Giuseppe Carbonara 3, Andrea Castellano 3 and Johan Plomp 6. EMOJ s.r.l., Spin-Off of the Polytechnic University of Marche, 60131 Ancona, Italy. Received: 11 October 2020; Accepted: 10 December 2020; Published: 12 December 2020

Introduction
Survey Methodology
Article Structure
ADAS Using Driver Emotion Recognition
Emotion Recognition in the Automotive Field
Facial Expression and Emotion Recognition
Valence and Engagement
Further Factors Influencing Driving Behavior
Emotional Effects on Driving Behavior
Emotion Induction and Emotion Regulation Approaches
Emotion Recognition Technologies in the Vehicle
Strategies to Build Human–Automation Cooperation
AI Components in ADAS
Sensing Components for Human Emotion Recognition
Inertial Sensors
Camera Sensors
Sensor-Equipped Steering Wheel and Wearables
Issues with Sensing Technologies
Statement and Vision
Development Process
Use Case on Driver Behavior Recognition
Improving Emotion Recognition for Vehicle Safety
Improving Assistance Using Accurate Driver Complex State Monitoring
Experimental Setup and Expected Results
Recommendations and Future Directions
Findings
Conclusions