Abstract
This paper discusses the problem of responsibility attribution raised by the use of artificial intelligence (AI) technologies. It is assumed that only humans can be responsible agents; yet this alone already raises many issues, which are discussed starting from two Aristotelian conditions for responsibility. Next to the well-known problem of many hands, the issue of “many things” is identified and the temporal dimension is emphasized when it comes to the control condition. Special attention is given to the epistemic condition, which draws attention to the issues of transparency and explainability. In contrast to standard discussions, however, it is then argued that this knowledge problem regarding agents of responsibility is linked to the other side of the responsibility relation: the addressees or “patients” of responsibility, who may demand reasons for actions and decisions made by using AI. Inspired by a relational approach, responsibility as answerability thus offers an important additional, if not primary, justification for explainability based, not on agency, but on patiency.
Highlights
- In response to recent progress and successes in artificial intelligence (AI), especially machine learning applications, ethics of AI has become a popular topic in academic and public discussions about the future of technology.
- This paper focuses on the question of responsibility attribution for artificial intelligence technologies used to automate actions and decisions usually made by humans.
- The paper offers an overview of problems concerning responsibility for AI, with a focus on the problem of responsibility attribution and responsibility as answerability.
Summary
In response to recent progress and successes in artificial intelligence (AI), especially machine learning applications, ethics of AI has become a popular topic in academic and public discussions about the future of technology. It is argued that the demand for explainability is usually justified via the knowledge condition (know what you are doing as an agent of responsibility), but that it should also be based on the moral requirement to provide reasons for a decision or action to those to whom you are answerable: the responsibility patients.