Abstract

The Article 29 Data Protection Working Party's recent draft guidance on automated decision-making and profiling seeks to clarify European data protection (DP) law's little-used right to prevent automated decision-making, as well as the provisions around profiling more broadly, in the run-up to the General Data Protection Regulation. In this paper, we analyse these new guidelines in the context of recent scholarly debates and technological concerns. They foray into the less-trodden areas of bias and non-discrimination, the significance of advertising, the nature of “solely” automated decisions, impacts upon groups and the inference of special categories of data—at times, appearing more to be making or extending rules than to be interpreting them. At the same time, they provide only partial clarity – and perhaps even some extra confusion – around both the much discussed “right to an explanation” and the apparent prohibition on significant automated decisions concerning children. The Working Party appears to feel less mandated to adjudicate in these conflicts between the recitals and the enacting articles than to explore altogether new avenues. Nevertheless, the directions they choose to explore are particularly important ones for the future governance of machine learning and artificial intelligence in Europe and beyond.

Highlights

  • In relation to a data subject, Article 22 of the General Data Protection Regulation (GDPR)[1] prohibits (with exceptions) any “decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or significantly affects him or her”

  • This right was ported to the GDPR from the Data Protection Directive (DPD) 1995 (arts 12(a) and 15),[2] and was itself borrowed from early French law

Summary

Background

In relation to a data subject, Article 22 of the General Data Protection Regulation (GDPR)[1] prohibits (with exceptions) any “decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or significantly affects him or her”. This right was ported to the GDPR from the Data Protection Directive (DPD) 1995 (arts 12(a) and 15),[2] and was itself borrowed from early French law.

Implications for information and access rights
Implications for Article 22 definitions
Suitable safeguards
Conclusion