Abstract

Echolocating big brown bats (Eptesicus fuscus) perceive their surroundings by broadcasting frequency-modulated (FM) ultrasonic pulses and processing returning echoes. Bats echolocate in acoustically cluttered environments containing multiple objects, where each broadcast is followed by multiple echoes at varying time delays. The bat must decipher this complex echo cascade to form a coherent picture of the entire acoustic scene. Neurons in the bat's inferior colliculus (IC) are selective for specific acoustic features of echoes and time delays between broadcasts and echoes. Because of this selectivity, different subpopulations of neurons are activated as the bat flies through its environment, while the physical scene itself remains unchanging. We asked how a neural representation based on variable single-neuron responses could underlie a cohesive perceptual representation of a complex scene. We recorded local field potentials from the IC of big brown bats to examine population coding of echo cascades similar to what the bat might encounter when flying alongside vegetation. We found that the temporal patterning of a simulated broadcast followed by an echo cascade is faithfully reproduced in the population response at multiple stimulus amplitudes and echo delays. Local field potentials to broadcasts and echo cascades undergo amplitude-latency trading consistent with single-neuron data but rarely show paradoxical latency shifts. Population responses to the entire echo cascade move as a unit coherently in time as broadcast-echo cascade delay changes, suggesting that these responses serve as an index for the formation of a cohesive perceptual representation of an acoustic scene.

NEW & NOTEWORTHY

Echolocating bats navigate through cluttered environments that return cascades of echoes in response to the bat's broadcasts. We show that local field potentials from the big brown bat's auditory midbrain have consistent responses to a simulated echo cascade varying across echo delays and stimulus amplitudes, despite different underlying individual neuronal selectivities. These results suggest that population activity in the midbrain can build a cohesive percept of an auditory scene by aggregating activity over neuronal subpopulations.

