Abstract

As robots become increasingly common in a wide variety of domains, from military and scientific applications to entertainment and home use, there is a growing need to define and assess the trust humans place in robots when interacting with them. Previous work on human interaction with robots and automation has found that humans often tend either to overuse automation, especially under high workload, or to underuse it, both of which can make negative outcomes more likely. Furthermore, this is not limited to naive users; it affects experienced ones as well. Robotics adds a new dimension to earlier work on trust in automation, as robots are envisioned by many to work as teammates with their operators on increasingly complex tasks. In this chapter, our goal is to highlight previous work on trust in automation and human-robot interaction and to draw conclusions and recommendations from the existing literature. We believe that, while significant progress has been made in recent years, especially in quantifying and modeling trust, several areas still require further investigation.

Highlights

  • Robots and other complex autonomous systems offer potential benefits by assisting humans in accomplishing their tasks

  • Trust measurements often cannot be conveniently taken during the course of a task, only after it is completed. This may suffice for automation such as automatic target recognition (ATR), where targets are missed at a fixed rate and the experimenter is investigating the effect of that rate on trust [33], but it does not work for measuring moment-to-moment trust in a robot reading QR codes to get its directions [30] (see the sketch after these highlights)

  • In this chapter, we briefly reviewed the role of trust in human-robot interaction
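
To make the measurement contrast in the highlights concrete, here is a minimal sketch of moment-to-moment logging; the class name, decay rule, and event encoding are illustrative assumptions on our part, not an instrument from the literature. Each reliance or intervention event updates a running behavioral trust proxy, whereas a questionnaire would yield a single score only after the task.

    # Illustrative sketch (assumptions, not from the chapter): a running
    # behavioral proxy for moment-to-moment trust, updated from logged
    # reliance events, versus a single post-task questionnaire score.
    from dataclasses import dataclass, field

    @dataclass
    class TrustLog:
        decay: float = 0.8        # weight kept on the previous estimate
        estimate: float = 0.5     # running trust proxy in [0, 1]
        trace: list = field(default_factory=list)

        def record(self, t: float, relied: bool) -> None:
            """Log one event: relied=True if the operator accepted the robot's action."""
            self.estimate = self.decay * self.estimate + (1 - self.decay) * float(relied)
            self.trace.append((t, self.estimate))

    log = TrustLog()
    for t, relied in [(1.0, True), (2.5, True), (4.0, False), (6.0, True)]:
        log.record(t, relied)
    print(log.trace)  # one trust estimate per event, not just one end-of-task score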

Introduction

Robots and other complex autonomous systems offer potential benefits by assisting humans in accomplishing their tasks. People have been observed to fail to monitor automation properly (e.g., turning off alarms) while automation is in use, or to accept the automation's recommendations and actions when doing so is inappropriate [71, 97]. This has been called misuse, complacency, or over-reliance. The opposite tendency, disuse, can decrease automation's benefits and lead to accidents if, for instance, safety systems and alarms are not consulted when needed. Another maladaptive attitude is automation bias [33, 55, 77, 88, 112], a user tendency to ascribe greater power and authority to automated decision aids than to other sources of advice (e.g., humans). The utility of introducing an intervening variable, such as trust, between automation performance and operator usage lies in the ability to make more precise or accurate predictions with the intervening variable than without it. This requires that trust in automation be influenced by factors in addition to automation reliability and performance.
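
To make the intervening-variable idea concrete, the following is a minimal Python sketch, not a model from the chapter: the update rule, parameter values, and function names are illustrative assumptions. Trust adapts toward observed automation performance, and reliance follows from comparing trust against the operator's self-confidence, in the spirit of the dynamic trust models reviewed below.

    # Toy model of trust as an intervening variable between automation
    # performance and reliance. The update rule and parameters are
    # illustrative assumptions, not the chapter's formulation.

    def update_trust(trust, observed_performance, alpha=0.2):
        """Move trust toward the most recently observed performance (0..1)."""
        return trust + alpha * (observed_performance - trust)

    def relies_on_automation(trust, self_confidence):
        """Rely on automation when trust exceeds the operator's self-confidence."""
        return trust > self_confidence

    # A mostly reliable automation with one salient failure (0.0):
    trust, self_confidence = 0.5, 0.6
    for performance in [0.9, 0.9, 0.0, 0.9]:
        trust = update_trust(trust, performance)
        print(f"trust={trust:.2f} rely={relies_on_automation(trust, self_confidence)}")

Even this toy model illustrates why reliance cannot be predicted from the latest automation performance alone: the operator's interaction history and self-confidence intervene, which is what makes trust useful as a predictive variable.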

Conceptualization of Trust
Modeling Trust
Factors Affecting Trust
System Properties
Properties of the Operator
Environmental Factors
Instruments for Measuring Trust
Trust in Human Robot Interaction
Performance-Based Interaction
Towards Co-adaptive Trust
Social-Based Interactions
Findings
Conclusions and Recommendations