Abstract

Automotive navigation systems are becoming ubiquitous driver assistance systems. Vendors continuously aim to enhance route guidance by adding new features to their systems. However, in an analysis of current navigation systems we found that many share interaction weaknesses that can damage a system's credibility. Such issues are most prevalent when a route is selected, when the driver intentionally deviates from the route, or when the system reacts to dynamic traffic warnings. In this work, we analyze the impact of these weaknesses on credibility and propose improved interaction mechanisms to enhance the perceived credibility of navigation systems. We improve route selection and the integration of dynamic traffic warnings by optimizing route comparability through relevance-based information display. Further, we show how bidirectional communication between driver and device can be enhanced to achieve a better mapping between device behavior and driver intention. We evaluated the proposed mechanisms in a comparative user study and present results that confirm positive effects on perceived credibility.

Highlights

  • Automotive navigation systems (ANS) have matured into a mainstream technology

  • While the difference in directly perceived credibility is not significant, believability was rated significantly higher by the experimental group (EG) (p = .026), which can be interpreted as an indicator of higher credibility

  • Mental load was low in both groups, but the amount of available information was rated significantly higher by the EG (p = .003)

Introduction

Automotive navigation systems (ANS) have matured into a mainstream technology. While integrated ANS are mostly found in middle- and higher-range cars, cheaper portable navigation devices (PNDs) bring navigation functionality to any vehicle. A navigation system's purpose is to support drivers in traveling from location A to destination B through route guidance. In unfamiliar environments, drivers place higher confidence in navigation commands while their self-confidence decreases [12]. In such situations, gullibility errors may occur, that is, the driver acts on an erroneous command because it is perceived as credible. Fogg and Tseng [8] define credibility as a perceived quality composed of a system's trustworthiness and expertise, where expertise captures the system's perceived knowledge and capabilities. The perceived quality of a system's hardware and interface determines surface credibility; reputed credibility stems from experience reports by others, while experienced credibility results from personal experience.
