This study employs semi-structured interviews and algorithmic ethnography to explore how algorithmic shadowbans have been used to moderate content related to Chinese gay men and to achieve targeted algorithmic governance. Through a multimethod approach combining thematic analysis and discourse analysis, this study argues that algorithms impose on Chinese gay men shadowbans that appear tolerant but are in fact restrictive, thematized here as “(im)permissible searching” and “(un)smooth posting.” The study conceptualizes such algorithmic shadowbans as “algorithmic camouflage,” a term that, from an interactive perspective, emphasizes the opacity of algorithms’ roles, behaviors, and purposes toward specific users and highlights the “hypocrisy” of algorithms. Under these hypocritical algorithmic shadowbans, the study suggests that a highly camouflaged “de-gaying” discourse, composed of dehumanization, de-emotionalization, and dramatization, is being shaped by algorithms on Chinese digital platforms.