In making a myriad of things more efficient and convenient, advanced data analytics are also opening our lives to unjustifiable biases and unseen manipulations. It is, however, less often considered that algorithmic decision making is also progressively changing the nature of liberal democratic forms of governance. In particular, it is rendering direct public participation in transparency and accountability mechanisms unsustainable. In this paper, I address this challenge in three steps. First, using the idea of participatory accountability, the paper explains the importance of information access rights to public engagement in governance. Outwardly, these rights are dispersed across distinctly different fields of law, including freedom of information, data protection and litigation laws and procedures. Yet, in practice, they not only share similarities in design and application, but also benefit from a contemporary political ethos that celebrates public knowledge and scrutiny of authority and power. Their impact on liberal democratic governance, moreover, often lies in their mass and random power to force the disclosure of information. Participatory accountability in the United Kingdom, however, is largely built on these transparency tools, leaving citizens to pursue accountability as they see fit through diverse processes of litigation, judicial review, regulatory complaint or media publicity. Second, the paper turns to the complexity and opacity of data analytics and the challenge this poses for direct public participation in algorithmic governance. This challenge goes well beyond a lack of technical expertise in the general public. Deliberately restricted by design and doctrine to limit compliance burdens and disclosure risks, public information access rights cannot be used to compel controllers of algorithmic decision making to provide systemic explanations of their data sources, purposes and outputs.
In short, participatory accountability is under-equipped for the future and is, consequently, losing its force and direction. Solutions will, furthermore, need to be as transformational as the societal consequences of data analytics. The answer does not, however, lie in enhanced access rights enabling the public to demand systemic explanations at will. The compliance and disclosure harms of that mass empowerment would undoubtedly outweigh its public transparency benefits. Third, the paper proposes workable solutions that avoid both the harms of mass empowerment and the alternative of remote and paternalist forms of algorithmic governance. Taking inspiration from the European Court of Human Rights judgment in Magyar Helsinki Bizottság v. Hungary, which recognised a derivative information access right under Article 10 ECHR, I argue that enhanced public access rights, including a power to compel systemic explanations of algorithmic decision making, should be recognised for qualifying media and societal watchdog organisations and individuals. In addition, where full public transparency is not possible, regulatory and self-regulatory mechanisms should be opened up to greater public participation, in particular by such watchdog organisations. As trusted intermediaries representing the public interest, statutory regulators should, in particular, resist the current rationales of paternalism in algorithmic governance. Relaxation of the legal barriers to direct public agency in accountability processes, as exemplified by the Court of Appeal's recent judgment in Lloyd v Google on representative actions, should also be pursued. While a cautious approach to enhanced litigation rights is understandable, greater flexibility in representative actions will be necessary as data analytics move beyond the knowledge and resources of the average citizen or consumer and the assistance of better-equipped watchdog organisations becomes essential.