Abstract

Human-robot and human-cyborg interactions requiring trust are increasingly common in the marketplace, workplace, on the road, and in the home, yet little is known about human willingness to make trust-based investments with non-human agents acting alone (i.e., “robots”), or bound to the welfare of non-deciding humans (i.e., “cyborgs”). Even less is known about the emotional reactions these interactions elicit. While other-regarding models of social preferences predict more trust-based investment in interactions that can benefit others, we see no difference in investments across conditions; we see only differences in emotional reactions to the trust-based interaction outcomes. The Recalibrational model of emotions predicts whether particular emotions are reported following trust-game interactions with people. Here we extend those emotion predictions to analogous trust-based interactions with robots and cyborgs that violate certain expectations of human-human relationships. Using a between-subjects design, we compare investment and emotions from human-human trust games to investment and emotions from nearly identical trust games (a.k.a. “risk games”) that humans play with a robot or with a cyborg. Between conditions we find different emotional reactions but fail to find differences in investment behavior. These results highlight a unique emotional facet of human interaction while providing support for the Recalibrational model of emotions.
