Abstract

This paper discusses the ethical nature of empathetic and sympathetic engagement with social robots, ultimately arguing that an entity engaged with through empathy or sympathy is engaged with as an “experiencing Other” and is, as such, due at least “minimal” moral consideration. Additionally, it is argued that extant HRI research often fails to recognize the complexity of empathy and sympathy, such that the two concepts are frequently treated as synonymous. The arguments for these claims proceed in two steps. First, it is argued that there are at least three understandings of empathy, such that particular care is needed when researching “empathy” in human-robot interactions. The phenomenological approach to empathy—perhaps the least utilized of the three understandings discussed—is the approach with the most direct implications for moral standing. Furthermore, because “empathy” and “sympathy” are often conflated, a novel account of sympathy is presented that makes clear the difference between the two concepts, and the importance of these distinctions is defended. In the second step, the phenomenological insights regarding the nature of empathy are applied to the problem of robot moral standing to argue that empathetic and sympathetic engagement with an entity constitutes an ethical engagement with it. The paper concludes by offering several potential research questions that result from the phenomenological analysis of empathy in human-robot interactions.

Highlights

  • Sympathetic and empathetic robots have become an increasingly popular topic of research within HRI

  • Is it preferable to have social robots that we can genuinely sympathize with—to open ourselves to what is given in experience, the datum of foreign experience—but which will not show sympathy? Or should robots which elicit sympathy show sympathy, even though it may be perceived as inauthentic? These questions are very different from those typically discussed in relation to robot moral standing and are of a more empirical than normative nature

  • I have argued that the debate over empathy in human-robot interactions has largely failed to recognize the distinctions between the three types of empathy on the one hand, and sympathy on the other

Summary

INTRODUCTION

Sympathetic and empathetic robots have become an increasingly popular topic of research within HRI. I will discuss three broad notions of empathy which researchers should have in mind when employing the concept, and offer a novel definition of sympathy that makes clear the distinction between empathy and sympathy and the connections of both phenomena to ascriptions of moral standing. Section two will briefly present the concepts of empathy and sympathy, discuss why the distinction matters, and consider how the terms have been used within extant HRI research, placing an emphasis on the valuable insights from phenomenological understandings of empathy—which have been insufficiently considered—and on the important empathy-sympathy distinction. I will argue there that a phenomenological understanding of empathy suggests that empathetic or sympathetic engagement with a robot already constitutes an ethical engagement (i.e., engagement with the robot as one which possesses at least “minimal” moral standing). The approach to robot moral standing offered here is similar to, yet distinct from, the relational approaches that have been offered by David Gunkel and Mark Coeckelbergh (Coeckelbergh, 2012; Gunkel, 2012; Gunkel, 2018a; Coeckelbergh, 2018), and is based primarily on the phenomenological understanding of empathy offered by Edith Stein (1964) and Max Scheler (1923).

EMPATHY AND SYMPATHY
Empathy
Defining Sympathy
Experiencing Otherness and Moral Standing
ROBOT MORAL STANDING
A Return to Sympathy
CONCLUSION