Abstract

The introduction of highly automated driving systems is expected to significantly change in-vehicle interactions, creating opportunities for the design of novel use cases and interactions for occupants. In this study, we sought to identify these novel use cases and determine preliminary auditory display recommendations for them. We generated use cases for level 4 automated vehicles through an expert workshop (N = 17) and online focus group interviews (N = 12). All of the generated use cases except meditation were then tested in a driving simulator study (N = 20), in which user opinions were collected. Results indicated that participants were interested in functions supporting both driving-related and non-driving-related interactions in highly automated vehicles. Three categories of use cases for level 4 automated vehicles were developed: driving automation use cases, immersion use cases, and in-vehicle notification use cases. In the driving simulator study, we tested three display modalities for interaction with drivers: visual alert only, non-speech with visual, and speech with visual. In terms of situation awareness (SA), the non-speech-with-visual display was associated with significantly better SA than the speech-with-visual display for the use case involving a planned increase in automation level. This study provides guidance on sonification design to advance user experiences in highly automated vehicles.
