Abstract

Automation is used widely across many domains to increase productivity. With newer, more complex automation, such as the self-driving car, humans will be required to forgo direct task performance in favor of a supervisory role over automation systems. Although the use of these systems generally yields better performance than humans acting alone, humans are reluctant to adopt these superior systems due to a lack of trust. The United States Department of Defense is investigating trust in automation in order to improve the rate of adoption of automation technology. Studying trust in automation systems requires a mechanism for quantifying and measuring trust. This paper proposes a method for measuring human trust behaviors in human-automation systems through response rates of compliance and reliance. Using behavioral data from a human-subjects experiment involving automated agents, we create a system dynamics model that relates trust to other system-level variables. With this trust model, engineers will be able to study trust in human-automation team scenarios and design automation systems with higher rates of adoption.
