Abstract
Drawing from science and technology studies (STS) and a feminist law critique, this article argues that procedural law is insufficient when addressing algorithmic discrimination and that ex ante protection may be a better way forward.
Highlights
Technology, discrimination and access to justice
We discriminate
Drawing from science and technology studies (STS) and a feminist critique of law, the article argues that procedural law fails to address algorithmic discrimination, as legal protection is built on datacentrism and individual-centred law.
Collective ex ante protection begins at the community level, builds on self-regulatory components and principles, and is maintained by public authorities. It keeps the individual act of harm at the core of its operations but shifts the responsibility for detecting biased algorithms from the individual to public authorities, since the person being discriminated against has few means of gaining access to relevant information about algorithms.
Summary
By drawing insight from each discipline to complement those areas where they fail to recognise or respond to algorithmic discrimination, a new kind of procedural response emerges. This response perceives legal procedures as design objects and builds on the understanding that law, technology and humans, as individuals and collectives, are constantly in a co-constitutive movement. The issue of algorithmic discrimination and procedural safeguards has been raised in several recently published policy papers in the European Union (EU). These documents recognise the importance of effective redress and enforcement mechanisms, conducted by official bodies, in addressing both individual and collective harm caused by discriminatory AI tools.