Abstract
Humans use two distinct cognitive strategies separately to understand and predict other humans' behavior. One is mind-reading, in which an internal state such as an intention or an emotional state is assumed to be a source of a variety of behaviors. The other is behavior-reading, in which an actor's behavior is modeled based on stimulus-response associations without assuming internal states behind the behavior. We hypothesize that anthropomorphic features are key for an observer switching between these two cognitive strategies in a competitive situation. We provide support for this hypothesis through two studies using four agents with different appearances. We show that only a human agent was thought to possess both the ability to generate a variety of behaviors and internal mental states, such as minds and emotions (Study 1). We also show that humans used mixed (opposing) strategies against a human agent and exploitative strategies against the agents with mechanical appearances when they played a repeated zero-sum game (Study 2). Our findings show that humans understand that human behavior is varied; that humans have internal states, such as minds and emotions; that the behavior of machines is governed by a limited number of fixed rules; and that machines do not possess internal mental states. Our findings also suggest that the function of mind-reading is to trigger a strategy for use against agents with variable behavior and that humans exploit others who lack behavioral variability based on behavior-reading in a competitive situation.
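As a concrete, purely illustrative sketch of the game-theoretic logic behind Study 2, the Python snippet below simulates a repeated matching-pennies game, a standard zero-sum game that stands in for the study's task (the paper's actual game, payoffs, and agents are not reproduced here; everything in the sketch is an assumption). A simple behavior-reading observer that predicts an opponent's next move from its past move frequencies reliably exploits a rule-bound, biased agent, but gains nothing against a uniformly random (mixed-strategy) agent.

```python
# Illustrative sketch only: repeated matching pennies (a zero-sum game).
# A predictable, rule-based opponent can be exploited by behavior-reading
# (tracking its observed choice frequencies), whereas a uniformly random
# (mixed-strategy) opponent cannot. Agents, payoffs, and parameters are
# hypothetical and not taken from the study.
import random
from collections import Counter

def play(opponent, rounds=10_000, seed=0):
    """Average payoff of a behavior-reading observer against `opponent`.

    The observer predicts the opponent's next move from its past move
    frequencies and plays the matching best response; payoff is +1 for a
    correct match and -1 otherwise."""
    rng = random.Random(seed)
    history = Counter()
    score = 0
    for t in range(rounds):
        # Best response: match the opponent's most frequent past move.
        prediction = max(history, key=history.get) if history else rng.choice("HT")
        move = opponent(t, rng)
        score += 1 if prediction == move else -1
        history[move] += 1
    return score / rounds

# A "mechanical" agent with limited behavioral variability: it mostly plays H.
biased_agent = lambda t, rng: "H" if rng.random() < 0.8 else "T"
# A mixed-strategy agent: uniformly random, leaving nothing to exploit.
mixed_agent = lambda t, rng: rng.choice("HT")

print(play(biased_agent))  # clearly positive: the biased agent is exploitable
print(play(mixed_agent))   # near zero: randomization defeats behavior-reading
```

Running the sketch prints a clearly positive average payoff against the biased agent and a value near zero against the mixed-strategy agent, mirroring the intuition that only agents lacking behavioral variability can be exploited through behavior-reading.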
Highlights
Humans sometimes attribute minds to intelligent machines, as shown by the case of HAL in the film “2001: A Space Odyssey.” To what types of agent do humans attribute minds? According to the Machiavellian intelligence hypothesis, human intelligence has been evolutionarily shaped, and the mind developed to handle complex social environments (Byrne and Whiten, 1988).
This might be because behavioral variability, although it is an important factor in producing alternative ways to reach goals and is explicitly modeled in artificial intelligence algorithms (Newell et al., 1959; Russell and Norvig, 1995; Sutton and Barto, 1998), is not included in the folk concept of “intelligence.” Instead, items that indicate behavioral variability, such as “is not mechanical,” “does not have a limited behavioral pattern,” and “does not act according to predefined rules,” were included in the “social intelligence” factor.
This finding indicates that the ability to generate a variety of behaviors and the presence of internal states, such as a mind and emotions, are related in the folk concept.
Summary
Humans sometimes attribute minds to intelligent machines, as shown by the case of HAL in the film “2001: A Space Odyssey.” To what types of agent do humans attribute minds? According to the Machiavellian intelligence hypothesis, human intelligence has been evolutionarily shaped, and the mind developed to handle complex social environments (Byrne and Whiten, 1988). We address mind attribution to intelligent agents in a competitive situation. One of the main properties of intelligence is the ability to generate unlimited behavioral patterns to reach a given goal (Byrne, 1995; Roth and Dicke, 2005). This ability, which is known as searching or exploration (Newell et al., 1959; Sutton and Barto, 1998), enables agents to find novel ways to reach their goals.
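To make the notion of exploration referenced above concrete (in the reinforcement-learning sense of Sutton and Barto, 1998), here is a minimal ε-greedy two-armed bandit sketch; the arms, reward probabilities, and parameters are hypothetical and not drawn from the paper. The exploring agent's occasional random deviations are precisely the kind of behavioral variability discussed above, and they let it discover a better way to reach its goal, while a purely greedy, rule-bound agent can remain stuck with an inferior action.

```python
# Illustrative sketch of "exploration": an epsilon-greedy agent occasionally
# deviates from its current best-known action, which both produces behavioral
# variability and lets it find better ways to reach its goal. The two-armed
# bandit and its reward probabilities are hypothetical.
import random

def run(epsilon, steps=5000, seed=1):
    rng = random.Random(seed)
    reward_prob = [0.3, 0.7]          # arm 1 is actually the better option
    value = [0.0, 0.0]                # estimated value of each arm
    count = [0, 0]
    total = 0
    for _ in range(steps):
        if rng.random() < epsilon:    # explore: try a random arm
            arm = rng.randrange(2)
        else:                         # exploit: pick the best-looking arm
            arm = max(range(2), key=lambda a: value[a])
        reward = 1 if rng.random() < reward_prob[arm] else 0
        count[arm] += 1
        value[arm] += (reward - value[arm]) / count[arm]  # incremental mean
        total += reward
    return total / steps

print(run(epsilon=0.1))  # exploring agent converges on the better arm
print(run(epsilon=0.0))  # purely greedy agent stays locked on the worse arm
```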