Abstract

In this paper, we argue that the Online Safety Act 2023 and Ofcom’s guidance incentivise online platforms to adopt a ‘Bypass Strategy’, under which they create and enforce content moderation rules that are broader than existing criminal law in order to bypass judgements about whether content is illegal. This strategy aims to avoid complex legal assessments of criminal intent and potential defences, which would be unfeasible given the volume of content on social media platforms and incompatible with automated moderation tools. We argue, however, that the Bypass Strategy, driven by the Act’s focus on illegal content and by the lack of clarity in Ofcom’s proposed guidance, poses a significant threat to users’ freedom of expression and incentivises the over-removal of legitimate speech. We offer insights that could help Ofcom improve its guidance on how platforms should interpret their content moderation duties, and that might mitigate this risk within the constraints of the Act.
