The aim of this doctrinal legal study is to analyze the interplay between the vulnerability of groups in algorithmic systems and the protection of collective interests under data protection law in the Brazilian legal system. Two research questions are raised: (i) Is personal data protection regulation applicable to data processing activities involving algorithmic groups? and (ii) can algorithmic groups be regarded as vulnerable groups under the LGPD legal regime? Apart from the introduction, this article is divided into three parts and combines three strands of research: group rights theory, vulnerability studies, and a law and technology perspective. This combination is key to outlining, in Sections 2 and 3, a theoretical framework that elucidates the concepts of collective data protection and group vulnerability, mapping both onto the notion of algorithmic groups. Section 2 argues for the collective dimension of the right to the protection of personal data as the foundation of collective data protection. Section 3, in turn, explores the conceptualization of group vulnerability and how this discourse resonates with algorithmic groups in the onlife world. I draw on vulnerability studies and on Mireille Hildebrandt's law and technology perspective to delineate what I mean by group vulnerability and how this notion articulates theoretically with algorithmic groups and the affordances of algorithmic systems. Section 4 examines the relation between collective data protection and the vulnerability of algorithmic groups under Brazil's data protection legal framework. To answer the research questions, the analysis concentrates on three aspects of Brazilian data protection law: (i) the “collectivization of data protection”; (ii) the integration of group vulnerability into the data protection legal framework; and (iii) data protection impact assessments in the context of the LGPD's risk-based approach.
The collective dimension of the right to personal data protection is increasingly recognized in Brazilian law through class-action litigation, particularly in the context of addressing vulnerabilities caused by new data-driven technologies. This collective dimension should guide courts and the Brazilian DPA in interpreting and applying the LGPD, especially Art. 12, § 2, regarding group data processing by algorithmic profiling systems. Data protection law in Brazil acknowledges that groups of data subjects may face vulnerability, requiring special protection and safeguards to mitigate risks and violations. Group vulnerability signals contexts deserving special attention and serves as a source of both obligations and rights. Within the LGPD's risk-based approach, mandatory DPIAs for ML-based algorithmic profiling systems help identify vulnerable groups and implement appropriate safeguards to mitigate risks of harm or rights violations. Non-compliance with the obligation to implement such safeguards should be considered a breach of Brazilian data protection law.