Abstract

Computerized systems are often employed to support control and decision-making tasks in complex and dynamic environments. Operators' trust or mistrust in these systems has been shown to significantly affect their performance. Consequently, errors of trust or mistrust may compromise system performance, with potentially disastrous results. Accordingly, trust should be considered in both the design and operation of human/machine systems. To do so, metrics and methods for measuring trust must be developed, along with models of human performance that incorporate trust and related system variables. Current approaches to trust measurement rely solely on subjective metrics grounded in theoretical concepts of trust between humans, which may not apply equally to trust in machines. Although researchers have established a relationship between trust and behavior, the resulting models lack an analytic foundation. The purpose of this research is therefore to develop a quantitative approach that relates trust to changes in system parameters and to the severity of errors.
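To make the goal concrete, the sketch below shows one hypothetical form such a quantitative relation could take: a first-order update rule in which trust recovers toward a baseline during fault-free operation and is reduced in proportion to the severity of observed errors. This is an illustration only, not the model developed in the paper; the function name, parameters (recovery_rate, penalty_gain, baseline), and the linear update rule are all assumptions introduced here.

```python
# Illustrative sketch only: a hypothetical first-order trust-update rule,
# NOT the model developed in this paper. All names and parameters
# (recovery_rate, penalty_gain, baseline) are assumptions for illustration.

def update_trust(trust: float,
                 fault_severity: float,
                 recovery_rate: float = 0.05,
                 penalty_gain: float = 0.5,
                 baseline: float = 1.0) -> float:
    """Return the next trust level given the current level and the
    severity of any fault observed on this step (0 = no fault)."""
    # Trust drifts back toward its baseline when the system behaves well...
    recovered = trust + recovery_rate * (baseline - trust)
    # ...and is reduced in proportion to the severity of observed errors.
    penalized = recovered - penalty_gain * fault_severity
    # Keep trust bounded in [0, 1].
    return max(0.0, min(1.0, penalized))


if __name__ == "__main__":
    trust = 1.0
    # A small fault, a severe fault, then several fault-free steps.
    for severity in [0.1, 0.8, 0.0, 0.0, 0.0]:
        trust = update_trust(trust, severity)
        print(f"fault severity {severity:.1f} -> trust {trust:.3f}")
```

A model of this kind makes trust an explicit state variable that can be fitted to operator data and coupled to system parameters, which is the sort of analytic foundation the abstract argues is missing from purely subjective measurement.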
