Abstract

With the rise of advanced biometric technologies, the surveillance of populations who do not match racial and gender norms has increased. Modern-day biometrics make assumptions about gender and race based on skin color, facial structure, body type, and body parts, which are encoded in predictive algorithms and other AI-driven systems. Growing empirical evidence points to the obstacles this poses for trans and non-binary individuals in several spheres, including border security, healthcare, and social media. Drawing on autoethnographic vignettes, semi-structured interviews, and survey responses, we examine the increased use of binary-based biometric technologies and automatic gender recognition (AGR), which rely on outmoded understandings of gender as static, measurable, and physiological. Our ethnographic data demonstrate how trans and non-binary bodies are forced to bend to these systems; meanwhile, these technologies and algorithms increasingly extract data on trans and non-binary users, which may then be used as challenge sets to refine their accuracy.
