Abstract
Emotional deception and emotional attachment are regarded as ethical concerns in human-robot interaction. Considering these concerns is essential, particularly as little is known about the longitudinal effects of interactions with social robots. We ran a longitudinal user study with older adults in two retirement villages, where participants interacted with a robot in a didactic setting for eight sessions over a period of four weeks. The robot showed either non-emotive or emotive behavior during these interactions in order to investigate emotional deception. Questionnaires measured participants' acceptance of the robot, their perception of the social interactions with it, and their attachment to it. Results show that the robot's behavior did not appear to influence participants' acceptance of the robot, their perception of the interaction, or their attachment to the robot. Time did not appear to influence participants' level of attachment to the robot, which ranged from low to medium, while the perceived ease of using the robot significantly increased over time. These findings indicate that a robot showing emotions (and thereby perhaps deceiving users) in a didactic setting may not by default negatively influence participants' acceptance and perception of the robot, and that older adults may not become distressed if the robot were to break or be taken away from them, as attachment to the robot in this didactic setting was not high. However, more research is required, as other factors may influence these ethical concerns, and supporting evidence from measures other than questionnaires is needed before conclusions regarding these concerns can be drawn.
Highlights
Awareness of and interest in ethical considerations for the development of social robots are growing, as robots become increasingly likely to be part of our everyday lives (Malle et al., 2015; Esposito et al., 2016; Li et al., 2019)
This study aims to establish whether the ethical concerns of emotional deception and emotional attachment raised in the literature are reflected in practice
This study investigated how emotional deception and emotional attachment may relate to acceptance of the robot and perception of the social interaction, as these are indicators for the future development of ethically safe socially assistive robots
Summary
Interest in ethical considerations for the development of social robots is growing, as robots become increasingly likely to be part of our everyday lives (Malle et al., 2015; Esposito et al., 2016; Li et al., 2019). Emotional deception could occur when the user believes that the robot genuinely experiences the emotions it displays, leading to unrealistic expectations that can result in the user prioritizing the robot's well-being over other people's or their own, as well as over-relying on the robot as a social assistant without exerting their own critical judgment (Fulmer et al., 2009). Another ethical concern is the possible development of emotional attachment to the robot (Sullins, 2012), which may cause distress in the user when the robot breaks or is taken away. The step from using robots with cognitively healthy older adults to using them with vulnerable older adults, such as those living with dementia, is small, and baseline requirements established through studies with healthy older adults are essential to ensure that it is ethically safe for vulnerable older adults to interact with the robot.