Abstract

To date, stakeholders across sectors are working intensively on policy design for artificial intelligence. These initiatives center on the requirement that AI algorithms be fair. But what exactly does fairness mean, and how can algorithmic fairness be translated into legal and policy terms? These are the main questions this paper aims to explore. Each discipline approaches them differently. While computer scientists may favor one notion of fairness over others across the board, this paper argues for a case-by-case analysis and application of the relevant fairness notion. The paper discusses the legal limitations of the computer science (CS) notions of fairness and suggests a typology matching each CS notion to its corresponding legal mechanism. The paper concludes that fairness is contextual: because each notion, or group of notions, corresponds to a different legal mechanism, each is more suitable for certain policy domains than for others. Accordingly, throughout the paper, examples of the possible applicability of the CS notions to particular policy domains are introduced. In addition, the paper highlights, for both developers and policymakers, the practical steps that need to be taken to better address algorithmic fairness. In some instances, notions of fairness that seem, on their face, unproductive from a technical perspective could in fact be quite helpful from a legal perspective. In other instances, notions that computer scientists find desirable could be challenging to implement in the legal regime because of the need to resolve complex moral and legal questions. Thus, the article emphasizes, a one-size-fits-all solution is not applicable to algorithmic fairness. Rather, only an approach grounded in a deep understanding of the specific context in which a given algorithm operates can guarantee a fairer outcome.
