The pervasive growth of algorithmic enforcement magnifies current debates regarding the virtues of transparency. Not only does the use of code to conduct robust online enforcement amplify the familiar problem of magnitude, or "too-much-information," often associated with present-day disclosures; it also creates additional practical difficulties for relying on transparency as an adequate check on algorithmic enforcement. In this Essay we explore the virtues of black box tinkering methodology as a means of generating accountability in algorithmic systems of online enforcement. Given the far-reaching implications of algorithmic enforcement of online content for public discourse and fundamental rights, we advocate active public engagement in checking the practices of automatic enforcement systems.

Accordingly, we explain why transparency is inadequate for generating public oversight. First, the complex computer code underlying algorithms is very difficult to read, follow, and predict; it is inherently opaque and capable of evolving in response to different patterns of data. Second, mandatory transparency requirements are irrelevant to many private implementations of algorithmic governance, which are shielded by trade secrecy. Third, algorithmic governance operates at such a scale that, even without mandatory transparency, it is impossible to review all the information already disclosed. Fourth, when algorithms are called on to replace humans in making determinations that involve discretion, transparency about the algorithms' inputs (the facts) and outputs (the outcomes) is not enough to allow adequate oversight. This is because a given legal outcome does not necessarily yield sufficient information about the reasoning behind it.

Subsequently, we establish the benefits of black box tinkering as a proactive methodology that encourages social activism, using the example of a recent study of online copyright enforcement practices by online intermediaries. That study sought to test systematically how hosting websites implement copyright policy by examining the conduct of popular local image-sharing and video-sharing platforms. In particular, different types of infringing, non-infringing, and fair use materials were uploaded to various hosting facilities, each intended to trace the choices made by the black box system throughout its enforcement process. The study's findings demonstrate that hosting platforms are inconsistent, and therefore unpredictable, in detecting online infringement and enforcing copyrights: some platforms allow content that is filtered by others; some platforms respond strictly to any notice requesting removal of content even when it is clearly non-infringing, while other platforms fail to remove content upon notice of alleged infringement. Moreover, many online mechanisms of algorithmic copyright enforcement do very little to minimize errors and to ensure that interested parties do not abuse the system to silence legitimate speech and over-enforce copyright. Finally, the findings indicate that online platforms do not make full efforts to secure due process and to allow affected individuals to follow, and promptly respond to, the proceedings that manage their online submissions.

Based on these findings, we conclude that black box tinkering methodology can offer an invaluable grasp of algorithmic enforcement practices on the ground. We then evaluate the possible legal implications of this methodology and propose means to address them.