Abstract
Recent developments in AI programming enable a new kind of application: individualized chatbots that mimic the speaking and writing behaviour of one specific living or deceased person. ‘Deathbots’, chatbots of the dead, have already been implemented and are currently under development by the first start-up companies. It is therefore urgent to consider the ethical implications of deathbots. Whereas previous ethical analyses of deathbots have been based on considerations of the dignity of the deceased, I propose to shift the focus to the dignity and autonomy of the bereaved users of deathbots. Drawing on theories of internet-scaffolded affectivity and on theories of grief, I argue that deathbots may have a negative impact on the grief process of bereaved users and therefore have the potential to limit the emotional and psychological wellbeing of their users. Deathbot users are likely to become dependent on their bots, which may make them susceptible to surreptitious advertising by the companies providing deathbots and may limit their autonomy. At the same time, deathbots may prove helpful for people who suffer from prolonged, severe grief. I caution against the unrestricted use of deathbots and suggest that they be classified as medical devices. Not least, this classification would mean that both their harmlessness and their helpfulness for people suffering from prolonged grief would need to be proven, and that their potential for infringing on users’ autonomy would be reduced.