Abstract

Recently there has been an emphasis on developing augmented reality systems that enhance the sensory signals listeners normally perceive in the real world. An early example of a practical augmented hearing system is the hearing aid, which amplifies the signals that would normally enter the ear in order to improve their audibility for hearing-impaired listeners. Many applications now envision using virtual audio technology to add synthetic, spatialized sounds to the “passthrough” audio signal that a hearing aid would normally produce. One challenge for the designers of both augmented and virtual audio systems is to ensure that they preserve the ability of both normal-hearing and hearing-impaired listeners to determine the spatial locations of sounds in the environment. The auditory cues required for localizing sounds are well understood for normal-hearing listeners, but less so for hearing-impaired listeners. Here we discuss the results of two experiments in which hearing-impaired listeners performed substantially worse than expected with virtual or augmented audio cues, and we present preliminary results suggesting that this unexpectedly poor performance may reflect a different weighting of localization cues across frequency regions than is typically observed for listeners with normal hearing.
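The notion of cue weighting across frequency regions mentioned above can be illustrated with a toy model. This is not the authors' model; it is a minimal sketch in which a listener's azimuth estimate is a weighted average of per-frequency-band cue estimates, where low-frequency bands are assumed to carry interaural time differences (ITDs) and high-frequency bands interaural level differences (ILDs). All band estimates and weight values are hypothetical numbers chosen purely for demonstration.

```python
import numpy as np

def weighted_azimuth(band_estimates, band_weights):
    """Combine per-band azimuth estimates (degrees) using normalized weights."""
    w = np.asarray(band_weights, dtype=float)
    w = w / w.sum()  # normalize weights so they sum to 1
    return float(np.dot(w, band_estimates))

# Hypothetical per-band azimuth estimates (degrees): the two low-frequency
# (ITD-dominated) bands point to 20 deg, the two high-frequency
# (ILD-dominated) bands point to 35 deg.
estimates = [20.0, 20.0, 35.0, 35.0]

# Hypothetical weighting profiles: a normal-hearing listener weighting
# low-frequency cues more heavily, versus a hearing-impaired listener
# weighting high-frequency cues more heavily.
nh_weights = [0.4, 0.3, 0.2, 0.1]
hi_weights = [0.1, 0.2, 0.3, 0.4]

print(weighted_azimuth(estimates, nh_weights))  # pulled toward the ITD bands
print(weighted_azimuth(estimates, hi_weights))  # pulled toward the ILD bands
```

Under this sketch, the same set of conflicting band-level cues yields different perceived locations for the two weighting profiles, which is the kind of effect the abstract's preliminary results suggest for hearing-impaired listeners.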
