Abstract

Emerging techniques for assessment and training of spatial hearing use virtual reality (VR) devices to present auditory-visual cues. Sounds may be processed using head-related transfer functions (HRTFs), which capture directional acoustic cues, and visuals may be presented on a head-mounted display (HMD). For users of hearing devices such as hearing aids or cochlear implants, microphone placement may alter the binaural cues carried by the HRTF. Cues may also be distorted in presentations that combine HMD visuals with loudspeaker audio. This work seeks to understand these impacts by measuring and building a database of HRTFs in the presence of hearing aids and HMDs. HRTFs were recorded in an anechoic chamber using a standard manikin placed within a horizontal-plane loudspeaker array. Recordings were made with real sound sources at 5.625° resolution and virtual sources at 1° resolution. We will compare interaural time and level differences (ITDs and ILDs) from HRTFs measured through hearing-aid microphones, varying microphone placement (behind-the-ear versus in-the-ear) and the presence versus absence of an HMD during the recording. Results will provide insight into HRTFs for individuals who use hearing devices, as well as a characterization of how an HMD affects VR-guided individual HRTF measurement.
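
As a concrete illustration of the planned comparison, the following is a minimal sketch (not code from the study) of how broadband ITD and ILD might be estimated from a measured pair of head-related impulse responses. The sampling rate, signal lengths, sign conventions, and function names are illustrative assumptions.

```python
import numpy as np

FS = 48_000  # assumed sampling rate in Hz


def estimate_itd(hrir_left: np.ndarray, hrir_right: np.ndarray, fs: int = FS) -> float:
    """Broadband ITD in seconds from the cross-correlation peak.
    Positive when the right-ear signal lags (source toward the left)."""
    corr = np.correlate(hrir_right, hrir_left, mode="full")
    lag = int(np.argmax(np.abs(corr))) - (len(hrir_left) - 1)
    return lag / fs


def estimate_ild(hrir_left: np.ndarray, hrir_right: np.ndarray) -> float:
    """Broadband ILD in dB from RMS energy; positive when the left ear is louder."""
    rms_l = np.sqrt(np.mean(np.square(hrir_left)))
    rms_r = np.sqrt(np.mean(np.square(hrir_right)))
    return 20.0 * np.log10(rms_l / rms_r)


if __name__ == "__main__":
    # Synthetic HRIR pair standing in for a source toward the left:
    # the right-ear response is delayed by 20 samples and attenuated by 6 dB.
    rng = np.random.default_rng(0)
    left = rng.standard_normal(256)
    right = np.zeros_like(left)
    right[20:] = left[:-20] * 10 ** (-6 / 20)
    print(f"ITD = {estimate_itd(left, right) * 1e6:.1f} µs")  # roughly +417 µs
    print(f"ILD = {estimate_ild(left, right):.1f} dB")        # roughly +6 dB
```

In practice such estimates would be computed per source direction and compared across microphone placements and HMD conditions; frequency-dependent ILDs would require band-limited analysis rather than the broadband RMS used here.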
