Abstract

This paper investigates the enforcement stage of the content moderation process on social media platforms. It argues that the current approach adopted by most platforms is underdeveloped, poses serious human rights issues, and would benefit from a number of reforms. The enforcement stage is arguably the most important stage in the moderation process because it most directly shapes what content remains available and how the platform is governed. It is also where concerns over private companies regulating the exercise of free expression online are at their most pronounced. This will be a central theme throughout the paper: the situation is troubling, yet laws like the Network Enforcement Act in Germany and decisions such as Google Spain require platforms to take on an increasing number of legal assessments. The paper will first explain what the role of the moderator entails and how the enforcement process occurs at social media platforms. The investigation will then turn to the enforcement process itself and identify a number of problems in how content rules are generally enforced. It will consider bias in decision-making, a problem that exists for both human and algorithmic moderators. The paper will use the example of blood and how menstrual blood is treated by platforms as far more objectionable than blood resulting from graphic violence or serious accidents, despite menstruation being a normal aspect of women’s health. This differential treatment stems from rules created by a male-dominated Silicon Valley, which embody a particular form of cultural bias. The paper will then examine over-reliance on efficiency as a solution, where platforms respond to controversies by offering solutions that promise enhanced speed or technical effectiveness without addressing the underlying issues. For example, platforms often fixate on how quickly content can be removed after it is posted, a narrative that this paper will argue has also influenced political discussions around social media. The paper will move on to investigate the inconsistent enforcement of terms and conditions, which increases user uncertainty and raises concerns that some narratives are privileged over others. For example, the Facebook Files leak featured multiple slides prohibiting any positive statements about the Irish Republican Army (the IRA) but no accompanying slides indicating that the same treatment should apply to other paramilitary groups from Northern Ireland. Finally, the paper will offer a number of suggestions for reform, such as the adoption of a publicly accessible body of precedents, containing anonymised case studies, as a tool for accountability and user empowerment that helps users understand what is and is not acceptable. It will also recommend moving away from the efficiency narrative to confront larger issues in enforcement that are not so easily solved, such as human rights and rule of law concerns, and placing more emphasis on troubleshooting problems before they become major scandals.
