Abstract

In the future, automated cars may feature external human–machine interfaces (eHMIs) to communicate relevant information to other road users. However, it is currently unknown where on the car the eHMI should be placed. In this study, 61 participants each viewed 36 animations of cars with eHMIs on either the roof, windscreen, grill, above the wheels, or a projection on the road. The eHMI showed ‘Waiting’ combined with a walking symbol 1.2 s before the car started to slow down, or ‘Driving’ while the car continued driving. Participants had to press and hold the spacebar when they felt it safe to cross. Results showed that, averaged over the period when the car approached and slowed down, the roof, windscreen, and grill eHMIs yielded the best performance (i.e., the highest spacebar press time). The projection and wheels eHMIs scored relatively poorly, yet still better than no eHMI. The wheels eHMI received a relatively high percentage of spacebar presses when the car appeared from a corner, a situation in which the roof, windscreen, and grill eHMIs were out of view. Eye-tracking analyses showed that the projection yielded dispersed eye movements, as participants scanned back and forth between the projection and the car. It is concluded that eHMIs should be presented on multiple sides of the car. A projection on the road is visually effortful for pedestrians, as it causes them to divide their attention between the projection and the car itself.

Highlights

  • A substantial number of studies have emerged on external human–machine interfaces for automated cars

  • If the path planning software of the automated driving system knows that the vehicle will slow down for an upcoming intersection, the external human–machine interfaces (eHMIs) could communicate that the vehicle is about to slow down [1]

  • The eye-tracking results showed that the Windscreen eHMI yielded a concentrated gaze pattern, which can be explained by the fact that this eHMI is embedded in the centre of the car

Introduction

A substantial number of studies have emerged on external human–machine interfaces (eHMIs) for automated cars. Previous research found that pedestrians often look at the wheels of parked cars, which provides motivation for a wheel-based eHMI. At present, it is unclear which eHMI location results in the best perceived clarity and behavioural compliance among pedestrians. We examined which eHMI location resulted in the highest time-percentage of spacebar presses while the automated vehicle slowed down for the participant, a continuous behavioural measurement method introduced by De Clercq et al. [1]. The development of commanding-text eHMIs is technologically challenging, because such a design requires that the automated vehicle knows for which road user the command is meant. Because the present study is concerned with the effect of eHMI location, we selected an eHMI design that was shown to be effective in previous research in virtual environments.
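The continuous measure described above (the time-percentage of spacebar presses during the vehicle's approach) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function name, the boolean-sample logging format, and the sampling rate are all assumptions.

```python
# Hypothetical sketch of the continuous crossing-intention measure:
# participants hold the spacebar while they feel safe to cross, and the
# dependent variable is the percentage of time the key was held during
# a given window of the car's approach.

def press_time_percentage(pressed_samples, start_s, end_s, sample_rate_hz):
    """Percentage of time the spacebar was held during [start_s, end_s).

    pressed_samples: one boolean per sample, True while the key is pressed.
    """
    i0 = int(start_s * sample_rate_hz)
    i1 = int(end_s * sample_rate_hz)
    window = pressed_samples[i0:i1]
    if not window:
        return 0.0
    return 100.0 * sum(window) / len(window)

# Example: a 10 s approach logged at 10 Hz; the spacebar is held
# for the final 4 s, giving a press-time percentage of 40%.
samples = [False] * 60 + [True] * 40
print(press_time_percentage(samples, 0.0, 10.0, 10))  # 40.0
```

In the study, this percentage would be computed per trial and then averaged over participants and eHMI conditions for comparison.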

Participants
Apparatus
Independent Variable
Design of the Animated Video Clips
Procedure and Task
Dependent Variables
Statistical Analyses
Self-Reported
Performance for Approaching Cars
Figure: Percentage of participants who pressed the spacebar, averaged over the approach
Figure: Screenshot of the animation for the straight approach with the Projection eHMI
Figure: Screenshot of the animation for the corner approach
Overall
Eye-Movements for Approaching Cars
Discussion
Performance
Eye-Tracking
Self-Reports
Limitations and Recommendations
Conclusions
