Abstract

Algorithmic discrimination has become one of the critical points in the discussion about the consequences of an intensively datafied world. While many scholars address this problem from a purely techno-centric perspective, others try to raise broader social justice concerns. In this article, we join those voices and examine norms, values, and practices among European civil society organizations in relation to the topic of data and discrimination. Our goal is to decenter technology and bring nuance into the debate about its role and place in the production of social inequalities. To accomplish this, we rely on Nancy Fraser's theory of abnormal justice, which highlights interconnections between the maldistribution of economic benefits, the misrecognition of marginalized communities, and their misrepresentation in political processes. Fraser's theory helps situate technologically mediated discrimination alongside other, more conventional kinds of discrimination and injustice, and privileges attention to the economic, social, and political conditions of marginality. Using a thematic analysis of 30 interviews with civil society representatives across Europe's human rights sector, we bring clarity to this idea of decentering. We show how many groups prioritize the specific experiences of marginalized groups and 'see through' technology, acknowledging its connection to larger systems of institutionalized oppression. This decentered approach contrasts with the process-oriented perspective of tech-savvy civil society groups that shy away from an analysis of systemic forms of injustice.
