Abstract

This chapter describes three ways in which content moderation by online intermediaries challenges the rule of law: it blurs the distinction between private interests and public responsibilities; it delegates the power to make social choices about the legitimacy of content to opaque algorithms; and it circumvents the constitutional safeguard of the separation of powers. The chapter then examines the barriers to accountability in online content moderation, including the dynamic nature of algorithmic moderation driven by machine learning; the partial nature of available data alongside overwhelming data floods; and the trade secrecy that shields the algorithmic decision-making process. Finally, the chapter proposes a strategy for overcoming these barriers, namely ‘black box tinkering’: a reverse-engineering methodology that governmental agencies and social activists alike could use as a check on private content moderation. After describing the benefits of black box tinkering, the chapter outlines the regulatory steps needed to promote the adoption of this oversight strategy.
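The abstract names ‘black box tinkering’ only at a high level. As a concrete illustration, the sketch below shows one plausible form such an audit could take: it treats the moderation system as an opaque function, submits matched pairs of posts that differ in a single term, and tallies the decisions to infer part of the hidden policy. Everything here is an assumption made for illustration: `moderate` is a hypothetical stand-in for a platform's moderation endpoint, and the paired-probe design is one common auditing technique, not the chapter's prescribed protocol.

```python
# A minimal sketch of black-box tinkering as algorithmic auditing:
# probe an opaque moderation system with matched inputs and compare
# its decisions to infer the policy it actually enforces.

import random
from collections import Counter

def moderate(text: str) -> str:
    """Hypothetical opaque moderation system (stand-in for a real endpoint)."""
    # Toy rule the auditor does not know in advance:
    # flag anything mentioning "protest".
    return "removed" if "protest" in text.lower() else "kept"

def paired_probe(template: str, term_a: str, term_b: str, n: int = 50) -> Counter:
    """Submit matched pairs differing in a single term and tally outcomes."""
    tally = Counter()
    for _ in range(n):
        suffix = f" #{random.randint(0, 9999)}"  # vary posts to avoid dedup
        tally[(term_a, moderate(template.format(term_a) + suffix))] += 1
        tally[(term_b, moderate(template.format(term_b) + suffix))] += 1
    return tally

if __name__ == "__main__":
    # If outcomes diverge across otherwise-identical posts, the divergence
    # is observable evidence of the hidden decision rule.
    results = paired_probe("Join the {} downtown tomorrow!", "protest", "parade")
    for (term, decision), count in sorted(results.items()):
        print(f"{term!r} -> {decision}: {count}")
```

In an actual tinkering study, the stub would be replaced by calls to the platform's public posting interface, and systematic differences in how the matched posts are treated would serve as the externally observable check on private moderation that the chapter advocates.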
