Abstract

Affinity profiling (grouping people according to their assumed interests rather than solely their personal traits) has become commonplace in the online advertising industry. Online platform providers use online behavioural advertising (OBA) and can infer very sensitive information (e.g. ethnicity, gender, sexual orientation, religious beliefs) about individuals in order to target or exclude certain groups from products and services, or to offer them different prices. OBA and affinity profiling raise at least three distinct legal challenges: privacy, non-discrimination, and group-level protection. Current regulatory frameworks may be ill-equipped to protect sufficiently against all three harms. I first examine several shortfalls of the General Data Protection Regulation (GDPR) concerning the governance of sensitive inferences and profiling. I then show the gaps in EU non-discrimination law in relation to affinity profiling, in terms of both its areas of application (i.e. employment, welfare, goods and services) and the types of attributes and people it protects. I propose that applying the concept of ‘discrimination by association’ can help close some of these gaps in legal protection against OBA. This concept challenges the idea of strictly differentiating between assumed interests and personal traits when profiling people. Failing to acknowledge the potential relationship, whether direct or indirect, between assumed interests and personal traits could render non-discrimination law ineffective. Discrimination by association occurs when a person is treated significantly worse than others (e.g. by not being shown an advertisement) based on their relationship or association (e.g. assumed gender or affinity) with a protected group. Crucially, the individual does not need to be a member of the protected group to receive protection: it does not matter whether the measure taken is based on a protected attribute the individual actually possesses or on their mere association with a protected group. Discrimination by association would thus help to overcome the argument that inferring one’s ‘affinity for’ and ‘membership in’ a protected group are strictly unrelated. As I will argue, not needing to be part of the protected group also means that people who are members do not have to ‘out’ themselves (e.g. in terms of sexual orientation or religion) to receive protection, if they prefer not to. Moreover, individuals who have been discriminated against but are not actually members of the protected group (e.g. people who have been misclassified as women) could also bring a claim. Even if these gaps are closed, challenges remain. The lack of transparency around business models and practices could pose a considerable barrier to proving non-discrimination cases. Finally, inferential analytics and AI expand the circle of potential victims of undesirable treatment in this context by grouping people according to inferred or correlated similarities and characteristics. These new groups are not accounted for in data protection and non-discrimination law. I close with policy recommendations to address each of these legal challenges posed by OBA and affinity profiling.
