Abstract
Robot animals, designed to mimic living beings, pose ethical challenges in the care of vulnerable patients, specifically concerning deception. This paper explores how emotions become a resource for dealing with the misinformative nature of robot animals in dementia care homes. Based on observations of encounters between residents, care workers, and robot animals, the study shows how persons with dementia approach the ambiguous robots as living beings, as material artifacts, or as something in-between. Grounded in interactionist theory, the research demonstrates that emotions serve as tools in the sense-making process, which unfolds through interactions with the material object and in collaboration with care workers. The appreciation of social robots does not hinge solely on whether they are perceived as real or fake animals; persons with dementia may find amusement in "fake" animals and express fear of "real" ones. This observation leads us to argue that there is a gap between guidelines addressing misinformation and robots, on the one hand, and the specific contexts in which the technology is used, on the other. In situations where small talk and play are essential activities, care workers often prioritize responsiveness to residents over ensuring that the robot's nature is transparent. In these situations, residents' emotional expressions serve not only as crucial resources for their own sense-making but also as valuable indicators that help care workers navigate care situations.