Abstract

People use knowledge acquired from past experiences when assessing the trustworthiness of a trustee. At a time when agents are increasingly accepted as partners in collaborative efforts and activities, it is critical to understand all aspects of human trust development in agent partners. For human-agent virtual ad hoc teams to be effective, humans must be able to trust their agent counterparts. To earn humans’ trust, agents need to quickly develop an understanding of human team members’ expectations and adapt accordingly. This study empirically investigates the impact of past experience on human trust in and reliance on agent teammates. To do so, we developed a team coordination game, the Game of Trust (GoT), in which two players repeatedly cooperate to complete team tasks without prior assignment of subtasks. The effects of past experience on human trust are evaluated through an extensive set of controlled experiments with participants recruited from Amazon Mechanical Turk, a crowdsourcing marketplace. We collect both teamwork performance data and survey responses to gauge participants’ trust in their agent teammates. The results show that positive (negative) past experience increases (decreases) human trust in agent teammates; that the lack of past experience leads to higher trust levels than positive past experience does; that positive (negative) past experience facilitates (hinders) reliance on agent teammates; and that trust in and reliance on agent teammates are not always correlated. These findings provide clear and significant evidence of the influence of key factors on human trust in virtual agent teammates and enhance our understanding of how human trust in peer-level agent teammates changes with past experience.
