Abstract

As the social media landscape undergoes broad transformation for the first time in over a decade, with alternative platforms like Mastodon, Bluesky, and Threads emerging where X has receded, many users and observers have celebrated the promise of these new services and their visions of alternative governance structures that empower consumers. Drawing on a large-scale textual analysis of platform moderation policies, capabilities, and transparency mechanisms, as well as semistructured group interviews with developers, administrators, and moderators of federated platforms, we found that federated platforms face considerable obstacles to robust and scalable governance, particularly with regard to persistent threats such as coordinated behavior and spam. Key barriers identified include underdeveloped moderation technologies and a lack of sustainable financial models for trust and safety work. We offer four solutions to the collective safety and security risks identified: (1) institutionalize shared responses to critical harms, (2) build transparent governance into the system, (3) invest in open-source tooling, and (4) enable data sharing across instances.
