Abstract

This article provides a case study of the initial release of Apple’s wireless AirPods earbuds, and the subsequent implementation of the Live Listen feature (which in effect allowed the AirPods to perform basic functions of hearing aids), to explore how hearable technologies attempt to integrate accessibility features for hard-of-hearing individuals alongside music playback and the computer processing that allows gesture-based interaction with these ear-worn devices. In particular, this analysis argues that the process of rearticulating and expanding the possible uses, and the larger social understandings, of earbuds as being for both hearing and hard-of-hearing individuals has continued to marginalize those with hearing loss: accessibility features are built in through software updates after initial releases, and hard-of-hearing users’ use of the devices goes unaccounted for in product launches and advertising materials. Positioning hearable technologies generally, and AirPods specifically, at the intersection of disability media studies and technology industry analysis, this research suggests that greater attention be paid to how companies like Apple use keynotes, developer conferences, and other trade rituals to project an ethos of accessibility around product launches despite rarely promoting or incorporating accessibility features at launch.
