Abstract

Algorithms on online platforms interact with users' identities in different ways. However, little is known about how users understand the interplay between identity and algorithmic processes on these platforms, and whether and how such understandings shape their behavior in return. Through semi-structured interviews with 15 US-based TikTok users, we detail users' algorithmic folk theories of the For You Page algorithm in relation to two interconnected identity types: personal and social identity. Participants identified potential harms that can accompany the algorithm's tailoring of content to their personal identities. Further, they believed the algorithm actively suppresses content related to marginalized social identities based on race and ethnicity, body size and physical appearance, ability status, class status, LGBTQ identity, and political and social justice group affiliation. We propose a new algorithmic folk theory of social feeds, the Identity Strainer Theory, to describe when users believe an algorithm filters out and suppresses certain social identities. In developing this theory, we introduce the concept of algorithmic privilege, held by users positioned to benefit from algorithms on the basis of their identities. We further propose the concept of algorithmic representational harm to refer to the harm users experience when they lack algorithmic privilege and are subjected to algorithmic symbolic annihilation. Additionally, we describe how participants changed their behaviors to shape their algorithmic identities to align with how they understood themselves, as well as to resist the suppression of marginalized social identities and the lack of algorithmic privilege via individual actions, collective actions, and altered performances. We theorize our findings to detail the ways the platform's algorithm and its users co-produce knowledge of identity on the platform. We argue that the relationship between users' algorithmic folk theories and identity is consequential for social media platforms, as it impacts users' experiences, behaviors, sense of belonging, and perceived ability to be seen, heard, and valued by others as mediated through algorithmic systems.
