Abstract
This paper reports on the ability of listeners to rapidly adapt when localizing virtual sound sources in both azimuth and elevation while listening to sounds synthesized using non-individualized head-related transfer functions (HRTFs). Participants were placed in an audio-kinesthetic Virtual Auditory Environment (VAE) platform that associates the physical position of a virtual sound source with an alternate set of acoustic spectral cues through a tracked physical ball manipulated by the participant. This set-up offers a natural perception-action coupling that is not limited to the visual field of view. The experiment consisted of three sessions: an initial localization test to evaluate baseline performance, an adaptation session, and a subsequent localization test. A control group using individually measured HRTFs was included as a reference. Results show a significant improvement in localization performance: relative to the control group, participants using non-individualized HRTFs reduced their localization errors in elevation by 10° over three 12-min sessions. No significant improvement was found for azimuthal errors or for single-session adaptation.
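As a minimal illustration of the binaural synthesis step the abstract refers to (not code from the paper), a virtual source is typically rendered by convolving a mono signal with the left- and right-ear head-related impulse responses (HRIRs, the time-domain form of HRTFs) for the target direction. The function name, sample rate, and placeholder noise-burst HRIRs below are illustrative assumptions; a real system would use a measured HRIR set.

```python
import numpy as np
from scipy.signal import fftconvolve

def render_binaural(mono, hrir_left, hrir_right):
    """Spatialize a mono signal by convolving it with the left/right
    head-related impulse responses (HRIRs) for one source direction."""
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    return np.stack([left, right], axis=-1)  # shape: (samples, 2)

# Hypothetical usage: a 0.5 s noise burst at 48 kHz rendered with
# placeholder 256-tap HRIRs (stand-ins for measured responses).
fs = 48000
mono = np.random.randn(fs // 2)
hrir_l = np.random.randn(256) * np.hanning(256)
hrir_r = np.random.randn(256) * np.hanning(256)
binaural = render_binaural(mono, hrir_l, hrir_r)
```

Using a non-individualized HRIR set in place of the listener's own is exactly the condition under which the paper studies adaptation: the spectral cues no longer match the listener's anatomy, degrading elevation judgments in particular.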