
Regulating for the Future: The Law's Enforcement Deficit

Orla Lynskey

I. Introduction

Digitization has changed the way we operate as individuals and as a society in significant ways, with this change accelerated by the Covid-19 pandemic. Digital technologies are now treated as 'imperative' across many spheres of life.1 A by-product of this digitization has been datafication, as formerly analogue phenomena need to be translated into data in order to digitize them. As our daily social, professional, administrative, and civic interactions have taken on a digital dimension, the data generated about these interactions are recorded and analysed for insights. The data generated by your morning coffee purchase might therefore reveal your location, your appetite for early mornings, and your price sensitivity, amongst many other insights. The aggregation of such insights across all your daily activities can create remarkably in-depth profiles of individual behaviour. Repeated across the population, datafication facilitates surveillance at a mass scale.

The creation and capture of data therefore have clear implications for society. Most evidently, the very act of capturing our interactions through data has privacy implications and could have a chilling effect on the exercise of other rights, including freedom of expression and association. Datafication therefore impacts upon established fundamental rights in democratic societies, with consequences for individuals and society at large.

Data also act as a source of power to their holders. Data analytics, including automated decision-making and machine learning, enable the holders of data to categorize and sort individuals in novel ways and to differentiate between cohorts on this basis. While we browse the Internet, the advertisements displayed are targeted based on our profiles, with individuals being grouped into categories to deliver these ads – categories including 'fertility', 'eating disorders', 'left-wing politics' and 'right-wing politics', amongst many others.2 Beyond clear privacy implications, the use of such categorizations to differentiate the content that individuals see and the opportunities they are offered offends against ideals of equality, exacerbates asymmetries of power and information, and has the capacity to further segment already fragmented societies.

These data collections have been a source of economic power for their holders and users, but their use goes beyond commerce. The state also engages in such classification-by-data. In England, for instance, resource-pressured local authorities have relied upon automated recommendations to identify children at risk,3 while the eligibility of criminal offenders for rehabilitation programmes is assessed, in part, algorithmically, taking into account information scraped from social media sites.4 Increasingly, we see the state operate in tandem with private-sector actors in ways that obscure the flow of data and render effective accountability more difficult to achieve.

The challenge for the law, against this backdrop, is to ensure continued respect for the rule of law and fundamental rights. In this contribution, the European Union's efforts to date are evaluated and two significant challenges identified. The paper proceeds by, first, providing an overview of the legislative initiatives introduced to regulate the European 'data economy'. In subsequent sections, it identifies and discusses two key challenges.
Firstly, there is an enforcement deficit – to which Ireland contributes – that limits the effectiveness of these laws and detracts from their legitimacy. This enforcement deficit has, ironically, led to a push for more legislation to plug perceived gaps, which may itself add to enforcement pressures. Secondly, and perhaps more fundamentally, there is an ambiguity in European data and digital policy regarding what objectives we seek to promote. While the overarching aim might be to secure rights-respecting innovation, there is little recognition that these two elements – respect for rights and innovation – might be in tension, and no clear indication of how conflicts between efficiencies and rights might be resolved.

II. The European approach to digital regulation

The European approach to digital regulation has been juxtaposed with that of the US, where a liberal ethos and the constraints of the First Amendment's protection of speech have characterized the response to digitization, and with that of China, where emphasis has been placed on promoting home-grown tech giants and on data-gathering for surveillance and commercial purposes.5 In contrast, Europe has taken it upon itself to...
