Abstract

This paper takes an interdisciplinary approach, combining legal studies, ethics and technical insights to shed light on the complex issues surrounding the regulation of predictive analytics. We argue that the individualised concept of regulation, shaped by the dogma of fundamental rights, is unable to adequately capture the implications of predictive analytics. We show that predictive analytics is a problem of collective privacy and informational power asymmetries, and conceptualise the form of data power at work in predictive analytics as ‘prediction power’. The unregulated prediction power of certain actors poses societal risks, especially if this form of informational power asymmetry is not normatively represented. The article analyses this legal lacuna in the light of recent case law of the European Court of Justice and new legislation at the EU level. To address these challenges, we develop the concept of ‘predictive privacy’ as a protected good based on collective interests.
