Abstract

While social robots bring new opportunities for education, they also come with moral challenges. Therefore, there is a need for moral guidelines for the responsible implementation of these robots. When developing such guidelines, it is important to include different stakeholder perspectives. Existing (qualitative) studies regarding these perspectives, however, mainly focus on single stakeholders. In this exploratory study, we examine and compare the attitudes of multiple stakeholders on the use of social robots in primary education, using a novel questionnaire that covers various aspects of the moral issues mentioned in earlier studies. Furthermore, we group the stakeholders based on similarities in attitudes and examine which socio-demographic characteristics influence these attitude types. Based on the results, we identify five distinct attitude profiles and show that the probability of belonging to a specific profile is affected by characteristics such as stakeholder type, age, education and income. Our results also indicate that social robots have the potential to be implemented in education in a morally responsible way that takes into account the attitudes of various stakeholders, although multiple moral issues need to be addressed first. Finally, we present seven practical implications for a responsible application of social robots in education that follow from our results. These implications provide valuable insights into how social robots should be implemented.

Highlights

  • The use of social robots in education has been subject to extensive moral debate.

  • We aimed to answer three research questions: (RQ1) What are the attitudes of stakeholders on the moral issues related to social robots in education? (RQ2) How can the attitudes related to these moral issues be categorized? (RQ3) Which socio-demographic characteristics influence the attitudes of stakeholders on the moral issues related to social robots in education? The results of our study can be used to gain a better understanding of the various perspectives on moral considerations related to the use of robots in education.

  • To answer RQ1, we ran a multivariate analysis of variance (MANOVA) to investigate whether attitudes and perceptions of moral issues regarding the use of social robots in education differ by stakeholder group (a minimal sketch of such an analysis follows this list).
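
The MANOVA mentioned in the last highlight can be illustrated with a minimal Python sketch using statsmodels. The file name attitudes.csv, the column names moral_issue_1 to moral_issue_3 and stakeholder_group, and the number of dependent variables are hypothetical placeholders for illustration only; they are not taken from the study.

    # Minimal MANOVA sketch: do attitude scores on several moral-issue scales
    # differ by stakeholder group? All names below are hypothetical placeholders.
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    # Hypothetical data: one row per respondent, with numeric attitude scores
    # (moral_issue_1..moral_issue_3) and a categorical stakeholder_group column.
    df = pd.read_csv("attitudes.csv")

    # Dependent variables on the left of ~, the grouping factor on the right.
    manova = MANOVA.from_formula(
        "moral_issue_1 + moral_issue_2 + moral_issue_3 ~ stakeholder_group",
        data=df,
    )

    # mv_test() reports Wilks' lambda, Pillai's trace, Hotelling-Lawley trace
    # and Roy's greatest root for the stakeholder_group effect.
    print(manova.mv_test())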


Introduction

The use of social robots in education has been subject to extensive moral debate. Their use in early education in particular (e.g., kindergarten and primary school) has raised several ethical issues, ranging from the impact of robots on the role of caregivers and teachers to issues related to dehumanization, privacy and accountability [1,2,3]. Despite such moral concerns, social robots are increasingly being introduced in primary education in the role of a tutor or teacher, and as a peer or a novice [4].
