Abstract

The prevalence of artificial intelligence (AI)-driven virtual personal assistants (VPAs), both in the home and in businesses, is increasing. Yet the key VPAs on the market today – Siri (Apple), Alexa (Amazon) and Cortana (Microsoft) – appear to be gendered female. This gendering takes place not only through their designation with female names that coincide with mythical and stereotyped views on gender (Siri is a Nordic name meaning “the beautiful woman that leads you to victory”), but also through female voices that users find more comfortable to instruct and give orders to than male voices, and through witty and flirtatious characters revealed in their programmed responses to even the most perverse questions. It is, therefore, a gendering which is problematic: it depicts the category “female” as an assistant – or secondary – to a male counterpart. Noting the post-phenomenological arguments set out by Mireille Hildebrandt that the technologies we use not only reflect and embed our presumptions and social biases but also reproduce them in new ways that have material effects on us, we explore how the gendering of AI-driven VPAs poses a critical social harm by, as Julie Cohen describes in relation to Hildebrandt’s contentions, ‘continually, imminently mediating and pre-empting our beliefs and choices’ about the role of women in society. More critically, and in response to what we argue to be a social harm caused by the gendering of AI-driven VPAs produced by the US-based companies Apple, Microsoft and Amazon, we explore the role and mandate of the Federal Trade Commission (FTC) as the broadly mandated regulatory body for consumer protection. In particular, we analyse two distinct functions of the FTC.
The first is its role in the protection of data privacy, and the extent to which data privacy impact assessments, which necessitate an investigation of the social impact of new technologies beyond the privacy paradigm, can be drawn on here as a potential solution. The second is the FTC’s role in investigating, protecting against and preventing ‘unfair or deceptive acts or practices in commerce’, and the extent to which the gendering of AI-driven VPAs can be said to constitute an unfair commercial practice.
