Abstract

Purpose
The paper examines the content moderation practices and related public disclosures of the world's most popular social media organizations (SMOs). It seeks to understand how content moderation operates as a process of accountability that shapes and informs how users (inter)act on social media, and how SMOs account for these practices.

Design/methodology/approach
Content analysis of the content moderation practices of selected SMOs was conducted using a range of publicly available data. Drawing on seminal accountability studies and the concepts of hierarchical and holistic accountability, the authors investigate the design and appearance of the systems of accountability that seek to guide how users create and share content on social media.

Findings
The paper unpacks the four-stage process of content moderation enacted by the world's largest SMOs. The findings suggest that while social media accountability may allow SMOs to control the content shared on their platforms, it may struggle to condition user behavior. This argument is built on the limitations the authors found in how performance expectations are communicated to users, the nature of the dialogue between SMOs and users who are "held to account", and the metrics used to determine the effectiveness of SMOs' content moderation activities.

Originality/value
This is the first paper to examine the content moderation practices of the world's largest SMOs. Doing so extends understanding of the forms of accountability that function in the digital space. Crucial future research opportunities are highlighted to provoke and guide debate in this research area of escalating importance.
