Abstract

Children are keen consumers of audiovisual media content. Video-sharing platforms (VSPs), such as YouTube and TikTok, offer a wealth of child-friendly or child-appropriate content, but also content which, depending on the age of the child, may be considered inappropriate or potentially harmful. Moreover, such VSPs often deploy algorithmic recommender systems to personalise the content that children are exposed to (e.g., through auto-play features), raising concerns about a lack of content diversity or about spirals of content related to, for instance, eating disorders or self-harm. This article explores the responsibilities towards children that existing, recently adopted, and proposed EU legislation imposes on VSPs. The instruments we investigate include the Audiovisual Media Services Directive, the General Data Protection Regulation, the Digital Services Act, and the proposal for an Artificial Intelligence Act. Based on a legal study of policy documents, legislation, and scholarship, this contribution investigates to what extent this legislative framework imposes obligations on VSPs to safeguard children’s rights and discusses how these obligations align across the different legislative instruments.
