Abstract

Trust in automation has been studied largely through a cognitive lens, though theories suggest that emotions play an important role. Understanding the affective aspects of human-automation trust can inform the design of systems that garner appropriate trust calibration. To this end, we designed four videos describing a hypothetical drone system: one control, and three with additional performance information, process information, or both. Participants reported the intensity of 19 emotions they would anticipate as system operator, perceptions of the system’s trustworthiness, individual differences, and perceptions of the institution behind the system. Emotions factored into hostility, positive, anxiety, and loneliness components that were regressed on system information, individual differences, and institutional trust. We found that financial risk-taking, recreational risk-taking, and propensity to trust influenced the intensity of different emotion factors. Moreover, greater perceptions of the institution’s ability led to more intense hostility emotions, greater perceptions of the institution’s benevolence led to less intense hostility, and integrity perceptions decreased anxiety and increased positive emotions. Lastly, structural assurance led to less intense hostility and anxiety and more intense positive emotions. These results offer support for the relationship between human-automation trust and emotions, warranting future research on how operator emotions can be addressed to improve trust calibration.
