Abstract

The governance of online platforms has unfolded across three eras – the era of Rights (which stretched from the early 1990s to about 2010), the era of Public Health (from 2010 through the present), and the era of Process (of which we are now seeing the first stirrings). Rights-era conversations and initiatives amongst regulators and the public at large centered predominantly on protecting nascent spaces for online discourse against external coercion. The values and doctrine developed in the Rights era have been vigorously contested in the Public Health era, during which regulators and advocates have focused (with minimal success) on establishing accountability for concrete harms arising from online content, even where addressing those harms would mean limiting speech. In the era of Process, platforms, regulators, and users must transcend this stalemate between competing values frameworks, not necessarily by uprooting Rights-era cornerstones like CDA 230, but rather by working towards platform governance processes capable of building broad consensus around how policy decisions are made and implemented. Some first steps in this direction, preliminarily explored here, might include making platforms information or “content” fiduciaries, delegating certain key policymaking decisions to entities outside of the platforms themselves, and systematically archiving data and metadata about disinformation detected and addressed by platforms.

Highlights

  • The protections of CDA 230 (47 U.S.C. § 230) for online speech, designed to safeguard nascent spaces against external coercion or interference, apply even in cases where intermediaries are actively involved in moderating user content; that is where the statute was meant to most affect prevailing law

  • By providing such a broad legal shield and embedding it in a federal statute that restrained what individual states could do, CDA 230 placed the dynamics of platform liability in amber for the sake of protecting forums for online speech

  • What might otherwise have been a decades-long process of common law development aimed at defining the specific contours of platform liability with respect to harmful content was instead determined in short order

Summary

From rights to public health

– harms just enough to elicit a solemn acknowledgement that freedoms sometimes come with a cost. The feed contents were ranked by algorithms optimizing, at least initially, for user engagement above all else, and making little visible distinction between different forms of content. They quickly flattened the landscape of internet content, presenting even the most roughhewn user-generated content (UGC) in a common stream, on par with meticulously fact-checked articles by professional journalists. The St. Petersburg-based Internet Research Agency even attempted, by way of Facebook’s Events feature, to convene a town hall meeting in Twin Falls, Idaho, to address what it called, in the guise of locals, a “huge upsurge of violence towards American citizens” carried out by refugees (Shane, 2017). Lurid as these examples might be, disinformation-related harms often arise less from, say, how many people went to the spurious Twin Falls town hall meeting, and more from the accumulation of chronic patterns of low-level but toxic user behavior, coordinated or otherwise. Public recognition of the real-world costs of this slow poisoning has brought regulators and others to speak quite plainly in terms of what we might call a “Public Health” model of content governance, which concerns itself with modelling and addressing these aggregate effects. Rather than defining through code, architecture, and policy what users can and cannot post, comment, or share, platforms are being asked to don the epidemiologist’s hat and mitigate specific and contextual harms to norms and institutions arising from interactions between users on a massive scale.

Process as transcendence
Platforms as content fiduciaries
Delegating content governance
Findings
Conclusion