Abstract
This article will distinguish between kinds of robots, point to their burgeoning development and application in the home and workplace, and describe their growing use in the classroom as teachers. It will describe their potential to support, for instance, language development, social and emotional training [e.g., for children with an autistic spectrum disorder (ASD)], and teaching and assessment, and will review researchers', teachers', students', and parents' responses to this use. Some of these responses recognize the potential usefulness of humanoid robots, but they also show an awareness that digital “thought” (AI) is not the same as human thought (HI) and some caution about using robots as teachers. This disparity generates problems and dilemmas. These stem from, for example, a lack of discretion in decision-making, a lack of emotion (other than by simulation), a limited creative ability (in the foreseeable future), the nature of AI/HI relationships, ethical and legal matters, and culturally unsuitable programming. These matters point to the need for forethought about robot roles and for a code of practice for teachers who work with them; such a code, derived from the discussion, is proposed. The introduction of robot teachers will have significant implications for teachers' roles and professional identity as human teachers move from being often solitary sources of learning to becoming managers of teaching and learning who need to provide learning opportunities creatively. This change in teacher identity and roles is described.
Highlights
Automation, the replacement of people in the workplace by machines, is not new, but digital technology has enormously increased the capabilities of these machines.
Students in different parts of the world, and even within one region, are likely to vary in the amount and kind of access they have to digital technology, including robot teachers.
There is a danger that we will drift into the future without forethought about how to use, and not use, robot teachers (SCAI, 2018).
Reviewed by: Jacqueline Joy Sack, University of Houston, United States; Karen Lenore Taylor, International School of Geneva, Switzerland.