Algorithmic copyright enforcement – the use of automated filtering tools to detect infringing content before it appears on the internet – has a profound impact on users' freedom to upload and share information. Instead of presuming that user-generated content (UGC) is non-infringing unless copyright owners take action and provide proof, automated filtering systems start from the default position that every upload is suspicious and that copyright owners are entitled to ex ante control over the sharing of information online. Where platform providers voluntarily introduce algorithmic enforcement measures, this may be seen as a private decision flowing from the freedom of companies to run their business as they see fit. Where, however, copyright legislation institutionalizes algorithmic enforcement and imposes a legal obligation on platform providers to employ automated filtering tools, the law itself transforms copyright into an instrument of censorship and filtering. Nonetheless, the new EU Directive on Copyright in the Digital Single Market (“DSM Directive”) follows this path and requires the use of automated filtering tools to ensure that unauthorized protected content does not populate UGC platforms. The new EU rules on UGC licensing and screening will inevitably lead to the adoption of algorithmic enforcement measures in practice: without automated content control, UGC platforms will be unable to escape liability for infringing user uploads. To provide a complete picture, however, it is important to also shed light on counterbalances that may distinguish this new, institutionalized form of algorithmic enforcement from the content filtering tools that have evolved as voluntary measures in the private sector. The DSM Directive underlines the need to safeguard user freedoms that support transformative, creative remixes and mash-ups of pre-existing content.
This feature of the new legislation may offer important incentives to develop algorithmic tools that go beyond the mere identification of unauthorized takings from protected works. It has the potential to encourage content assessment mechanisms that factor the degree of transformative effort and user creativity into the equation. As a result, more balanced content filtering tools may emerge in the EU. Against this background, the analysis shows that the new EU legislation does not merely escalate the use of algorithmic enforcement measures that began in the private sector years ago. If implemented correctly, it may also add an important nuance to existing content identification tools and alleviate the problems arising from reliance on automated filtering mechanisms.