Abstract

As a popular social media platform, TikTok relies heavily on User-Generated Content (UGC), which is also one of its main attractions. The rapid production of UGC has implications for TikTok's ability to moderate a large volume of content in a short time. Although algorithmic technology and automatic filtering assist this process, a moderator team that is responsive and efficient in carrying out moderation actions is still needed. This need has driven the growth of the Commercial Content Moderation (CCM) industry, in which TikTok employs teams of moderators to review and moderate content uploaded by users. This research aims to explore the complexities and unique aspects of Indonesian TikTok content moderation and its implications for the development of UGC. It uses an in-depth interview method, with informants from the TikTok Moderator Team as primary data and related secondary sources as supporting data. The research focuses on how the Indonesian TikTok Moderator Team carries out moderation practices in handling content related to vulnerable groups in society, such as women, children, and the elderly, who over the last seven years have shaped the dynamics of TikTok in Indonesia. By highlighting the challenges the Moderator Team faces in balancing users' freedom of expression against the security of the TikTok platform, this research offers recommendations for improving content moderation practices on the platform, especially for the protection of these vulnerable groups.
