Abstract

In contemporary society, the rise of social media has dramatically transformed the sharing of information, bypassing traditional editorial and governmental controls. This shift has enabled rapid global information sharing but has also raised concerns about the influence of social media platforms, even in democratic societies. Legislative responses, such as Germany's Network Enforcement Act of 2017, mandated the swift removal of illegal content and influenced more than twenty other nations to enact similar laws. These regulations often target hate speech but risk suppressing political opposition, particularly in authoritarian regimes. The European Union's Digital Services Act, which came into force in 2024, imposes stringent removal obligations on platforms. This enhanced platform regulation pushes companies towards automated content moderation as a means of meeting those obligations. A 2024 report found that a substantial majority (87.5% to 99.7%) of deleted comments on Facebook and YouTube in France, Germany, and Sweden were legally permissible, suggesting that platforms, pages, or channels may be over-removing content to avoid regulatory penalties. Against the backdrop of this recent data, the paper examines the possible impact of stringent platform liability legislation on free speech in the context of hate speech regulation, focusing on the delicate balance between regulation and freedom of expression.
