Abstract

This paper examines crackdowns on queer content on TikTok and the creative responses of content creators who circumvent biased content moderation and cisheteronormative censorship. The first portion of the paper demonstrates TikTok’s recurrent cisheteronormative biases in content moderation decisions and examines select instances of LGBTQ+ content that has been censored on the platform, situating them within a broader trend of LGBTQ+ censorship across internet platforms. The second portion of the paper examines how LGBTQ+ TikTok users have built up folk knowledges and intuitive understandings of TikTok’s blackboxed algorithms and opaque content moderation policies, situating this discussion within theories of the ‘algorithmic imaginary’. It catalogs the myriad ways that TikTok users work to circumvent LGBTQ+ censorship on the platform (e.g. by tactically obscuring keywords in both speech and text and obscuring body parts and scenes). In the final portion of the paper, I draw on the concept of ‘cruising’ and other constitutive silences of LGBTQ+ existence to show how LGBTQ+ users are particularly well suited to producing folk knowledge about blackboxed algorithms. In closing, I examine the affordances and the limitations of LGBTQ+ users’ approach to navigating platform governance, and content moderation practices more specifically, and call for more organized and collective action in pursuit of more permanent changes towards LGBTQ+-friendly platforms.
