Abstract

The promise of virtual environments for studying sensory perception and cognition cannot be overstated. Because such environments can approximate aspects of the real world while affording a high degree of experimental and stimulus control, they promise to “bring the real world into the lab” and vice versa [Stecker 2019, Hear J. 72(6):20-23], helping to make research and clinical testing more valid and relevant to real-world situations. Such benefits accrue to all sorts of investigations; in this presentation we focus on relatively low-level aspects of auditory cognition—specifically, the ability of listeners to differentiate sensory cues belonging to different objects in a complex scene (e.g., talkers) and to integrate information across disparate, partially informative cues. Across multiple studies, we investigate the potential for virtual environments to enhance this work. A particular focus on simplified environments aims to elucidate a minimum feature set supporting different aspects of reality-like performance. [Work supported by US NIH R01-DC016643.]
