Abstract

Content moderation has exploded as a policy, advocacy, and public concern. But these debates still tend to be driven by high-profile incidents and to focus on the largest, US based platforms. In order to contribute to informed policymaking, scholarship in this area needs to recognise that moderation is an expansive socio-technical phenomenon, which functions in many contexts and takes many forms. Expanding the discussion also changes how we assess the array of proposed policy solutions meant to improve content moderation. Here, nine content moderation scholars working in critical internet studies propose how to expand research on content moderation, with implications for policy.

Highlights

By Tarleton Gillespie and Patricia Aufderheide

  • Content moderation scholarship faces an urgent challenge of relevance for policy formation

  • Content moderation – the detection of, assessment of, and interventions taken on content or behaviour deemed unacceptable by platforms or other information intermediaries, including the rules they impose, the human labour and technologies required, and the institutional mechanisms of adjudication, enforcement, and appeal that support it – has exploded as a public, advocacy, and policy concern: from harassment to misinformation to hate speech to self-harm, across questions of rights, labour, and collective values

  • A more explicitly political research programme that foregrounds public policy and regulatory studies should help us better understand content moderation as a political relationship between what I have previously outlined as a platform governance triangle of political actors (Gorwa, 2019a): individual firms and industry associations; non-governmental civil society groups, individuals, journalists, and researchers; and regulators, policymakers, and various political institutions


Summary

Introduction

Content moderation scholarship faces an urgent challenge of relevance for policy formation. Debates still centre on the largest, US-based platforms: their size makes them desirable venues for bad-faith actors eager to have an impact; their policies and techniques set a standard for how content moderation works on other platforms; they offer the most visible examples; and they drive legislative conversations. Yet any policy enacted to regulate moderation or curb online harms, while it may reasonably have Facebook or YouTube in its sights, will probably in practice apply to all platforms and user-content services. As a starting point to a larger discussion, each of our authors provides a suggestion for expanding the study of content moderation, with the ultimate goal of sound policy grounded in human rights and open societies. The third group of suggestions considers the future of governance and governmental regulation.

Looking beyond Facebook
The future of regulating content moderation
Conclusion
