Abstract

Concentration of power, in terms of user traffic and copyrighted content, is most evident in content platforms in China. Such concentration has had an unexpected impact on the way we understand and appreciate creativity, on copyright enforcement and the determination of liability on content platforms, and on the regulation of the cultural market by the government. Specifically, the concentration of power in content platforms has not only curbed direct online piracy to a large extent but has also accelerated the fragmentation of copyright enforcement and spawned the need for algorithmic recommendation and filtering systems, which in turn has reinforced China's cultural censorship system. This paper argues that the employment of algorithms by platforms must be treated with prudence: the algorithmic decision-making systems employed by platforms must be as transparent as possible, and remedies must be provided for affected users. The algorithms employed by content platforms must be adjusted to reflect not just the interests of the platforms but also the public interest in accessing and delivering information, as well as local policy considerations. This paper suggests that our regulatory framework should reflect the algorithmic turn of content platforms in its legal and non-legal instruments and alleviate their negative impact on society.
