Abstract

The robust representation of the environment from unreliable sensory cues is vital for the efficient function of the brain. However, how neural processing captures the most reliable cues is unknown. The interaural time difference (ITD) is the primary cue for localizing sound in horizontal space. ITD is encoded in the firing rate of neurons that detect interaural phase difference (IPD). Because of the filtering effect of the head, the IPD for a given location varies with the environmental context. We found that, in barn owls, at each location there is a frequency range where the head filtering yields the most reliable IPDs across contexts. Remarkably, the frequency tuning of space-specific neurons in the owl's midbrain varies with their preferred sound location, matching the range that carries the most reliable IPD. Thus, frequency tuning in the owl's space-specific neurons reflects a higher-order feature of the code that captures cue reliability.

Highlights

  • Perception relies on sensory cues that are used by the brain to infer properties of the environment

  • To test whether weighting by reliability occurs in the owl's sound localization pathway, we first mapped the spatial and frequency tunings across the entire external nucleus of the inferior colliculus (ICx); this analysis included 177 single units obtained from 138 recording sites in the ICx of two adult barn owls

  • To test whether interaural phase difference (IPD) reliability is consistent with the pattern of interaural correlation induced by concurrent sounds, we examined the mean interaural correlation at the interaural time difference (ITD) of the target source within each frequency channel while concurrent sounds from other locations were presented (a minimal illustrative sketch follows this list)
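
The per-channel measure described in the last highlight can be illustrated with a minimal sketch: band-pass the left- and right-ear signals into frequency channels and compute the normalized interaural correlation at the target's ITD in each channel. The sampling rate, filter design, frequency bands, and toy stimuli below are assumptions chosen for illustration, not the study's stimuli or analysis code.

```python
# Hypothetical sketch: mean interaural correlation at a target ITD,
# computed per frequency channel. All parameters are illustrative.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 48000  # assumed sampling rate (Hz)

def bandpass(x, f_lo, f_hi, fs=FS):
    """Band-pass filter approximating one frequency channel."""
    sos = butter(4, [f_lo, f_hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def interaural_correlation_at_itd(left, right, itd_s, fs=FS):
    """Normalized correlation of left/right signals at a fixed ITD lag."""
    lag = int(round(itd_s * fs))
    if lag >= 0:
        l, r = left[lag:], right[:len(right) - lag]
    else:
        l, r = left[:len(left) + lag], right[-lag:]
    l, r = l - l.mean(), r - r.mean()
    return float(np.dot(l, r) / (np.linalg.norm(l) * np.linalg.norm(r) + 1e-12))

def mean_correlation_per_channel(left, right, target_itd_s, bands):
    """Interaural correlation at the target ITD, one value per band."""
    return {
        (f_lo, f_hi): interaural_correlation_at_itd(
            bandpass(left, f_lo, f_hi), bandpass(right, f_lo, f_hi), target_itd_s
        )
        for f_lo, f_hi in bands
    }

# Toy usage: a broadband "target" delayed between the ears plus an
# uncorrelated concurrent sound; correlation is then measured per channel.
rng = np.random.default_rng(0)
n = FS  # 1 s of signal
target = rng.standard_normal(n)
itd = 100e-6  # assumed 100-microsecond target ITD
shift = int(round(itd * FS))
left = np.concatenate([np.zeros(shift), target[:n - shift]]) + 0.5 * rng.standard_normal(n)
right = target + 0.5 * rng.standard_normal(n)
bands = [(1000, 3000), (3000, 5000), (5000, 7000), (7000, 9000)]
print(mean_correlation_per_channel(left, right, itd, bands))
```

In this toy example, the concurrent uncorrelated sound reduces the correlation at the target ITD; under the framing of the highlight above, channels retaining higher correlation would carry the more reliable IPD.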

Introduction

Perception relies on sensory cues that are used by the brain to infer properties of the environment. Sound localization relies on auditory spatial cues, including phase differences of sounds between the ears (Moiseff and Konishi, 1983; Grothe et al., 2010). Contextual factors, such as whether the environment is reverberant, quiet, or noisy, can strongly influence these spatial cues. In particular, concurrent sounds can shift the auditory cues used to localize a target sound away from the values that would be measured if the target were emitted alone in a quiet environment (Keller and Takahashi, 2005). For localization to remain accurate, the cues associated with a given location must therefore be similar across different contexts.
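
As a concrete, hypothetical illustration of this point, the sketch below estimates the interaural phase difference (IPD) per frequency from the cross-spectrum of toy binaural signals, with and without an uncorrelated concurrent source at a different ITD. The signal parameters and the cross-spectral estimate are assumptions made for illustration; they are not the stimuli or analysis of the cited studies.

```python
# Hypothetical illustration: a concurrent sound shifts the measured IPD
# away from the value obtained when the target is presented alone.
import numpy as np
from scipy.signal import csd

FS = 48000  # assumed sampling rate (Hz)
N = FS      # 1 s of signal

def ipd_spectrum(left, right, fs=FS):
    """IPD per frequency, taken as the phase of the cross-spectral density."""
    freqs, pxy = csd(left, right, fs=fs, nperseg=2048)
    return freqs, np.angle(pxy)

rng = np.random.default_rng(1)
target = rng.standard_normal(N)
itd = 100e-6                        # assumed 100-microsecond target ITD
shift = int(round(itd * FS))
left_quiet = np.roll(target, shift) # circular delay: a toy simplification
right_quiet = target

# Add an uncorrelated concurrent source with the opposite ITD.
masker = rng.standard_normal(N)
left_noisy = left_quiet + 0.8 * np.roll(masker, -shift)
right_noisy = right_quiet + 0.8 * masker

freqs, ipd_quiet = ipd_spectrum(left_quiet, right_quiet)
_, ipd_noisy = ipd_spectrum(left_noisy, right_noisy)

# Compare the IPD near 3 kHz in the two conditions.
k = int(np.argmin(np.abs(freqs - 3000)))
print(f"IPD near 3 kHz: quiet {ipd_quiet[k]:.2f} rad, "
      f"with concurrent sound {ipd_noisy[k]:.2f} rad")
```

The printed values differ across the two conditions, which is the sense in which a concurrent sound makes the measured cue depart from its quiet-environment value.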
