Abstract

Extended Reality (XR) is the umbrella term for Augmented, Virtual, and Mixed Reality interfaces. XR interfaces enable natural and intuitive interaction with secondary driving tasks such as maps, music, and calls without requiring the driver to take their eyes off the road. This work evaluates the ISO 9241 pointing task in XR interfaces and analyzes the ease of interaction and the physical and mental effort required in augmented and mixed reality interfaces. While fully automated vehicles becoming an everyday reality is still some years of research away, drivers in a semi-automated vehicle must be prepared to intervene upon a Take-Over Request. In such cases, the human driver may not be required to have full manual control of the vehicle throughout its operation, but instead intervenes as and when required and performs passive supervision the rest of the time. In this paper, we evaluate the impact of using XR interfaces to assist drivers with take-over requests and during the first second of controlling the vehicle. A prototype of a simulated semi-autonomous driving assistance system was developed with similar interfaces in AR and MR. User studies were performed for a comparative analysis of mixed reality and augmented reality displays/interfaces based on response time to take-over requests. In both the ISO 9241 pointing task and the automotive task, the AR interface took significantly less time than the MR interface in terms of task performance. Participants also reported that the screen-based AR interface required significantly less mental and physical effort than the HoloLens-based MR interface.
