Abstract

In human-robot interactions, people tend to attribute mental states such as intentions or desires to robots in order to make sense of their behaviour. This cognitive strategy is termed the “intentional stance”. Adopting the intentional stance influences how one considers, engages with, and behaves towards robots. However, people differ in their likelihood of adopting the intentional stance towards robots, so it is crucial to assess these interindividual differences. In two studies, we developed and validated the structure of a task, the Intentional Stance task (IST), aimed at evaluating the extent to which people adopt the intentional stance towards robot actions. The IST probes participants’ stance by requiring them to judge the plausibility of two descriptions (mentalistic vs. mechanistic) of the behaviour of a robot depicted in a scenario composed of three photographs. Results showed a reliable psychometric structure of the IST. The paper therefore concludes with the proposal to use the IST as a proxy for assessing the degree to which people adopt the intentional stance towards robots.

Highlights

  • Humans readily attribute intentionality and mental states to living and non-living entities such as robots (Fisher, 1991; Fletcher et al., 1995; Epley et al., 2007)

  • To test the validity of the Intentional Stance task (IST), we evaluated its external validity in relation to anthropomorphic attributions, measured by the Human-Robot Interaction Evaluation Scale (Spatola et al., 2020)

  • The first experiment aimed at examining the psychometric structure of the Marchesi et al. (2019) material and at identifying inter-individual differences in the tendency to adopt the intentional stance towards robots. Based on the factor analysis, we extracted 12 items grouped into two factors with reliable fit indices

Introduction

Intentional Stance Towards Robots

Humans readily attribute intentionality and mental states to living and non-living entities such as robots (Fisher, 1991; Fletcher et al., 1995; Epley et al., 2007). Rather than explaining a robot’s behaviour solely in terms of its mechanical design, a more efficient strategy (in terms of predictions) is to adopt the “intentional stance,” which assumes that mental states are the underlying explanations of the observed behaviour (Thellman et al., 2017; De Graaf and Malle, 2019). Within this context, social robots represent a particular category of artefacts explicitly designed to potentially elicit the adoption of the intentional stance (for a review, see Perez-Osorio and Wykowska, 2020). The attribution of mental states to robots increases their acceptance and is associated with more positive attitudes towards them.
