Abstract

Recent research shows that how we respond to other social actors depends on the kind of mind we ascribe to them. We compared how the perceived minds of artificial agents shape people's behavior toward them in the dictator game, the ultimatum game, and negotiation. To do so, we varied agents' minds on the two dimensions of mind perception theory, agency (cognitive aptitude) and patiency (affective aptitude), via descriptions and dialogs. In our first study, agents with emotional capacity garnered larger allocations in the dictator game, whereas in the ultimatum game both described agency and affective capacity led to greater offers. In the second study, on negotiation, agents ascribed low-agency traits earned more points than those ascribed high-agency traits, even though the negotiation tactic was identical for all agents. Although patiency did not affect game points, participants sent more happy and surprise emojis and more emotionally valenced messages to agents that demonstrated emotional capacity during negotiations. Our exploratory analyses further indicate that, across all games, people related only to agents with perceived affective aptitude. Both perceived agency and affective capacity contributed to moral standing after the dictator and ultimatum games, but after negotiations only agents with perceived affective capacity were granted moral standing. Thus, manipulating the mind dimensions of machines has different effects in dictator and ultimatum games than in a more complex economic exchange like negotiation. We discuss these results, which show that through negotiations agents are perceived not only as social actors but as intentional actors, in contrast with simple economic games.

Highlights

  • Philosophical explorations of what a mind is and how we perceive it have long been an active area of inquiry (e.g., Dennett 2008)

  • We designed virtual robots with different types of minds that varied along the dimensions described by mind perception theory (MPT) in order to observe the resulting influence on human interactants’ behavior in the dictator game (DG), ultimatum game (UG), and negotiations

  • We report people’s perception of the machine’s moral standing and mind (MPT), exploratory analyses of people’s emotional states, and how people related to the machine (IOS) for both studies; scales can be found in the “Appendix”


Introduction

Philosophical explorations of what a mind is and how we perceive it have long been an active area of inquiry (e.g., Dennett 2008). Empirically testing our perception of other minds, in particular whether and how we perceive technological entities to have minds, is a relatively new project. How we are affected when we perceive an artificial agent to have a mind is critical to explore as a growing number of digital beings enter our everyday environments. According to mind perception theory (MPT), the mind is assessed on two dimensions: agency, which encompasses cognition, and patiency, which encompasses emotions [4]. The ability to attribute mental states to oneself and/or others is known as having a theory of mind [11]; the most commonly attributed mental state is intent, according to Premack and Woodruff (1978). People tend to be biased towards their own minds as a frame of reference when interacting with both humans and artificial agents [5].
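For readers unfamiliar with the two simpler games studied here, the following sketch illustrates their payoff rules. This is a hypothetical illustration, not the authors' experimental software: the endowment size, function names, and return format are assumptions made for exposition only.

```python
# Hypothetical sketch of the dictator game (DG) and ultimatum game (UG)
# payoff rules, NOT the authors' implementation.

ENDOWMENT = 10  # assumed stake size; the paper does not specify it here

def dictator_game(offer: int) -> tuple[int, int]:
    """DG: the allocator's split is final; the recipient has no say."""
    assert 0 <= offer <= ENDOWMENT
    return ENDOWMENT - offer, offer  # (allocator payoff, recipient payoff)

def ultimatum_game(offer: int, accepted: bool) -> tuple[int, int]:
    """UG: the recipient may reject the proposed split, leaving both with nothing."""
    assert 0 <= offer <= ENDOWMENT
    if accepted:
        return ENDOWMENT - offer, offer
    return 0, 0  # rejection punishes both players

# Example: a 3-of-10 offer is final in the DG but can be rejected in the UG.
print(dictator_game(3))          # (7, 3)
print(ultimatum_game(3, False))  # (0, 0)
```

The key structural difference, which motivates comparing the two games, is the recipient's veto power in the UG: because a rejection costs the proposer everything, offers in the UG reflect strategic anticipation of the other party's reaction rather than generosity alone.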

