Both human-based and automated decisions are shaped by national legislation and international regulations, as well as by platforms' community guidelines or Terms of Service, a set of private rules that control what is and is not allowed on digital platforms. Such restrictions on what may or may not be deemed acceptable content are taking place in a context of increasing "digital authoritarianism" in which global internet freedom has declined for eleven consecutive years. Moreover, content moderation practices are currently shaped by corporate interests, within increasingly concentrated markets that are influenced by the network effects of the digital economy. Although it is well recognised that the same rights that people have offline must also be protected online, there are increasing examples of legislation and companies' Terms of Service that are leading to restrictions on freedom of expression and other human rights. On the other hand, most national regulations constitute fragmented regulatory attempts and do not touch upon several issues that could propel content moderation practices towards greater respect for human rights, such as design features or procedures closer to responsive regulation models. In this paper we argue that, despite their limitations and formal legal status, the United Nations Guiding Principles on Business and Human Rights (UNGPs) offer a good starting point for addressing some of these problems and encouraging rules and procedures for moderating user-generated online content that put human rights at the very centre of those decisions. Indeed, the UNGPs provide the foundations for the development of a responsive normative framework that is consistent across borders, addressing the problems generated by fragmented regulatory attempts and by multinational corporations operating in diverse contexts.
In an area where there are no easy or clear-cut solutions, the UNGPs provide procedural safeguards to collectively address the problems generated in content moderation. Moreover, the normative framework provided by the UNGPs constitutes the backbone against which to judge governments that are adopting deficient public policies, or against which to address different issues in the absence of regulation. Lastly, the UNGPs' greatest innovation is the creation of an independent duty to respect human rights for powerful private actors such as social media platforms, which in some cases accumulate more power and institutional capacity than a significant share of governments around the globe. Although previous literature has already adopted a human rights approach to content moderation, we here attempt to take a closer look at the specific role that the UNGPs may play in addressing the challenges of what has been called the "hidden industry" of content moderation within "mostly submerged systems of technological governance". In the first section, we briefly describe and analyse the features of the UNGPs that we think make them an adequate normative framework to address the human rights challenges that arise in the context of regulating and moderating user-generated content in the online world. We argue that, despite the limitations of an experimental and polycentric approach, the UNGPs have supported some progress, especially when it comes to the recognition of human rights standards in the tech sector. In the second section, we provide the reader with a brief introduction to the ways in which content regulation/moderation has negatively affected the human right to freedom of expression. The following section addresses the relationship between the UNGPs and the duties or roles that each actor could play in addressing the issue: governments, corporations, civil society organizations and individuals. Finally, we end with some concluding remarks and future research questions.