Abstract

Virtual personal assistants (VPAs) are increasingly common in everyday life. However, with female names, voices and characters, these devices appear to reproduce harmful gender stereotypes about the role of women in society and the type of work women perform. Designed to ‘assist’, VPAs – such as Apple's Siri and Amazon's Alexa – reproduce and reify the idea that women are subordinate to men and exist to be ‘used’ by men. Despite their ubiquity, these aspects of their design have received little critical attention in scholarship, and the potential legal responses to this issue have yet to be fully canvassed. Accordingly, this article sets out to critique the reproduction of negative gender stereotypes in VPAs and explores the provisions and findings of international women's rights law to assess both how this constitutes indirect discrimination and the possible means of redress. In this regard, the article examines the obligation to protect women from discrimination at the hands of private actors under the Convention on the Elimination of All Forms of Discrimination Against Women, and the work of the Committee on the Elimination of Discrimination against Women on gender stereotyping. With regard to corporate human rights responsibilities, the role of the United Nations Guiding Principles on Business and Human Rights is examined, as are domestic enforcement mechanisms for international human rights norms and standards, noting the limitations to date in enforcing human rights compliance by multinational private actors.
