Abstract

Over the past year, many young creators who use the Chinese-owned social networking platform TikTok have claimed that its underlying algorithm surveils and suppresses the reach of content by Black, brown, fat, queer, and disabled creators. However, despite these algorithmic biases, these marginalized creators have continued to find new and ingenious ways to not only create but also successfully share anti-racist, anti-misogynistic, LGBTQIA+-supportive, and body-positive content on the platform. Using this tension, this essay engages visual content analysis and critical technocultural discourse analysis to examine the innovative ways marginalized creators employ TikTok’s various medium and technological affordances to evade algorithmic surveillance and oppression. Building on Simone Browne’s concept of dark sousveillance, I theorize these practices as acts of digital dark sousveillance, defined within the essay as the use of digital tools to enact surveillance subversion, obfuscation, and inversion while operating within systems of racializing surveillance.
