Abstract

Focusing on toxic recommender algorithms, the article argues that the platform immunities granted in the early 2000s are incompatible with the regulated self-regulation of the EU Digital Services Act 2022 [DSA] and the UK Online Safety Act 2023 [OSA]. Whilst the DSA/OSA regimes recognise platforms as active shapers of the online content sphere, the immunities remain stuck in the early view of those same platforms as passive providers of neutral infrastructure and thus as innocent messengers. The immunities therefore continue to shelter platforms from liability for harm caused by the toxic-but-profitable algorithms that the DSA/OSA seek to restrain. The tension between the two regimes may be resolved by restricting the immunities to truly neutral platforms, that is, those not substantially invested in the content they are meant to regulate. For all other platforms, the DSA/OSA self-regulatory regimes and standard liabilities could and should run in parallel.
