Abstract

There are concerns that UK policing could soon be awash with 'algorithmic impropriety'. Big(ger) data and machine learning-based algorithms combine to produce opportunities for better intelligence-led management of offenders, but they also create regulatory risks and threats to civil liberties, even if these can be mitigated. In constitutional and administrative law terms, the use of predictive intelligence analysis software to serve up 'algorithmic justice' presents varying human rights and data protection problems, depending on the manner in which the output of the tool concerned is deployed. Regardless of exact context, however, all uses of algorithmic justice in policing raise linked fears: of potential fettering of discretion, arguable biases, possible breaches of natural justice, and troubling failures to take relevant information into account. The potential for 'data discrimination' in the growth of algorithmic justice is a real and pressing problem. This paper sets out a number of arguments, using grounds of judicial review as a structuring tool, that could be deployed against algorithmically-based decision-making processes that one might conceivably object to when encountered in the UK criminal justice system. Such arguments could be used to enhance and augment data protection and/or human rights grounds of review in this emerging algorithmic era, for example where a campaign group or an individual claimant seeks a remedy from the courts in relation to a particular algorithmically-based decision-making process or outcome.

Highlights

  • As well as great opportunities, there are considerable negative dimensions and risks of using algorithmic intelligence analysis in the context of what has been categorised as 'high stakes public sector decision-making'[1]

  • This paper sets out a number of administrative law arguments, using grounds of judicial review as a structuring tool, that could be deployed against algorithmically-based decision-making processes that one might conceivably object to when encountered in the UK criminal justice system

Summary

Introduction

As well as great opportunities, there are considerable negative dimensions and risks of using algorithmic intelligence analysis in the context of what has been categorised as 'high stakes public sector decision-making'[1]. The campaign group Big Brother Watch has weighed in on the privacy risks and the corollary threats to personal liberty of this practice, at a time when the technology is largely inaccurate and relatively lightly regulated [23][24]. In another relatively high-profile trial of algorithmically predictive policing of offenders, albeit in the charging decision context, Durham Constabulary have been training their algorithmic 'Harm Assessment Risk Tool' (HART) to inform decision-making by custody officers in the future. The Gangs Matrix case study offered here shows that 'high stakes public sector decision-making' drawing on algorithmic intelligence analysis may not currently be predicated on specific statutory provisions, but it is regulated to a considerable degree not only by human rights law and administrative law, but also by the Equality Act 2010 and the newer provisions of the Data Protection Act 2018. There are related issues in determining the level of intrusion that must arguably be reached before authorisation (and accountability) is required under the Regulation of Investigatory Powers Act 2000 for the obtaining of the OSINT data that can form part of the set of data points used by an algorithmic assessment tool [38].

Algorithmic improprieties?
Decisional opacity
Known accuracy biases and public policy
Discussion of the impact of the Data Protection Act 2018
Conclusions