Abstract

The advantages of automated driving can only come fully into play if these systems are used in an appropriate way, which means that they are neither used in situations they are not designed for (misuse) nor in an overly restricted manner (disuse). Trust in automation has been found to be an essential psychological basis for appropriate interaction with automated systems. Well-balanced system use requires a calibrated level of trust that corresponds to the actual ability of an automated system. Given these far-reaching implications of trust for safe and efficient system use, the psychological processes in which trust is dynamically calibrated prior to and during the use of automated technology need to be understood. To date, only a limited body of research has investigated the role of personality and emotional states in the formation of trust in automated systems. In this research, the role of the personality variables depressiveness, self-efficacy, self-esteem, and locus of control for the experience of anxiety prior to the first encounter with a highly automated driving system was investigated. Additionally, the relationship of these personality variables and anxiety to the subsequent formation of trust in automation was examined. In a driving simulator study, personality variables and anxiety were measured before the interaction with an automated system. Trust in the system was measured after participants had driven with the system for a while. Trust in the system was significantly predicted by state anxiety and by the personality characteristics self-esteem and self-efficacy. The relationships of self-esteem and self-efficacy with trust were mediated by state anxiety, as supported by significant specific indirect effects. While for depressiveness the direct relationship with trust in automation was not significant, an indirect effect through the experience of anxiety was supported. Locus of control showed no significant association with trust in automation. The reported findings underline the importance of individual differences in negative self-evaluations and in the anxiety experienced when being introduced to a new automated system for explaining individual differences in trust in automation. Implications for future research as well as for the design of automated technology in general and automated driving systems in particular are discussed.
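
The mediation result reported above (specific indirect effects of self-esteem and self-efficacy on trust through state anxiety) can be made concrete with a small worked example. The following Python sketch is purely illustrative: it uses simulated data and hypothetical variable names rather than the authors' analysis code, and shows how a specific indirect effect in a simple mediation model might be estimated with a percentile bootstrap confidence interval.

import numpy as np

rng = np.random.default_rng(42)

# Simulated, hypothetical data: predictor (self-esteem), mediator (state anxiety),
# outcome (trust in automation). Effect sizes are illustrative only.
n = 200
self_esteem = rng.normal(0.0, 1.0, n)
state_anxiety = -0.5 * self_esteem + rng.normal(0.0, 1.0, n)               # a path
trust = 0.3 * self_esteem - 0.4 * state_anxiety + rng.normal(0.0, 1.0, n)  # c' and b paths

def indirect_effect(x, m, y):
    """a*b: effect of x on m, times effect of m on y controlling for x."""
    a = np.polyfit(x, m, 1)[0]                          # slope of m ~ x
    design = np.column_stack([np.ones_like(x), x, m])   # y ~ 1 + x + m
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]    # coefficient of m
    return a * b

# Percentile bootstrap for the specific indirect effect
boot = []
for _ in range(5000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(self_esteem[idx], state_anxiety[idx], trust[idx]))
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])

print(f"indirect effect (a*b): {indirect_effect(self_esteem, state_anxiety, trust):.3f}")
print(f"95% bootstrap CI: [{ci_low:.3f}, {ci_high:.3f}]")

If the bootstrap interval excludes zero, the specific indirect effect is considered significant, which is the criterion the abstract refers to.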

Highlights

  • In the coming years, highly automated vehicles are expected to become an affordable, everyday technology with a broad user group (Hörl et al., 2016)

  • This research underlines the importance of considering anxiety and other emotional states in trust formation when people first become acquainted with automated systems

  • The experience of anxiety when being introduced to the new technology was found to be rooted in individuals’ tendency to experience depressive symptoms and to hold negative self-evaluations



Introduction

In the coming years, highly automated vehicles are expected to become an affordable, everyday technology with a broad user group (Hörl et al., 2016). If users’ trust is not calibrated, the dangers of disuse (failing to use a system to its full potential) and misuse (overstretching a system’s capabilities) increase (Parasuraman and Riley, 1997). To prevent these negative outcomes and to facilitate trust calibration, a thorough understanding of the psychological processes associated with trust formation and calibration is essential (e.g., Lee and See, 2004; Hoff and Bashir, 2015; Kraus et al., 2019b). These processes could be addressed to personalize the design of automated technology (e.g., prior information about the system, training, system functionality, and user interfaces) in order to enhance trust calibration.

