Abstract

This article offers a case study of the initial release of Apple's wireless AirPods earbuds and the subsequent implementation of the Live Listen feature (which in effect allowed the AirPods to perform basic functions of hearing aids) to explore how hearable technologies attempt to integrate accessibility features for hard-of-hearing individuals alongside music playback and the computer processing features that allow gesture-based interaction with these ear-worn devices. In particular, this analysis argues that the process of rearticulating and expanding the possible uses, and the larger social understandings, of earbuds as being for both hearing and hard-of-hearing individuals has continued to marginalize those with hearing loss: accessibility features are added through software updates after initial releases, and the use of these devices by hard-of-hearing individuals goes unaccounted for in product launches and advertising materials. Positioning hearable technologies generally, and AirPods specifically, at the intersection of disability media studies and technology industry analysis, this research suggests that greater attention be paid to how companies like Apple use keynotes, developer conferences, and other trade rituals to suggest an ethos of accessibility around product launches despite rarely promoting or incorporating accessibility features at launch.

