Abstract

The inferior colliculus (IC) of the midbrain is important for complex sound processing, such as discriminating conspecific vocalizations and human speech. The IC's non-lemniscal, dorsal "shell" region is likely important for this process, as neurons in these layers project to higher-order thalamic nuclei that subsequently funnel acoustic signals to the amygdala and non-primary auditory cortices, forebrain circuits important for vocalization coding in a variety of mammals, including humans. However, the extent to which shell IC neurons transmit the acoustic features necessary to discern vocalizations is less clear, owing to the technical difficulty of recording from neurons in the IC's superficial layers via traditional approaches. Here we use 2-photon Ca2+ imaging in mice of either sex to test how shell IC neuron populations encode the rate and depth of amplitude modulation, important sound cues for speech perception. Most shell IC neurons were broadly tuned, with low neurometric discrimination of amplitude modulation rate; only a subset were highly selective to specific modulation rates. Nevertheless, a neural network classifier trained on fluorescence data from shell IC neuron populations accurately classified amplitude modulation rate, and decoding accuracy was only marginally reduced when highly tuned neurons were omitted from the training data. Rather, classifier accuracy increased monotonically with the modulation depth of the training data, such that classifiers trained on full-depth modulated sounds had median decoding errors of ~0.2 octaves. Thus, shell IC neurons may transmit time-varying signals via a population code, with perhaps limited reliance on the discriminative capacity of any individual neuron.
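
The population-decoding approach summarized above can be illustrated with a minimal sketch. The array names (responses, am_rates_hz), the classifier architecture, and the synthetic tuning model below are illustrative assumptions rather than the study's actual pipeline; the only element taken from the abstract is the summary statistic, the median absolute decoding error expressed in octaves.

```python
# Minimal sketch: decode amplitude-modulation rate from population responses.
# Assumptions (not from the paper): trial-wise fluorescence responses are stored
# in `responses` with shape (n_trials, n_neurons), and `am_rates_hz` holds the
# AM rate (Hz) presented on each trial.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in data: 8 AM rates spaced one octave apart, 40 trials each,
# 200 broadly tuned, noisy neurons (hypothetical values for illustration).
am_rate_set = 2.0 ** np.arange(1, 9)                 # 2 to 256 Hz
n_trials_per_rate, n_neurons = 40, 200
am_rates_hz = np.repeat(am_rate_set, n_trials_per_rate)
pref = rng.uniform(1, 8, n_neurons)                  # preferred log2(rate) per neuron
tuning = np.exp(-0.5 * ((np.log2(am_rates_hz)[:, None] - pref[None, :]) / 2.0) ** 2)
responses = tuning + 0.5 * rng.standard_normal(tuning.shape)

# Train a small feedforward classifier on the population responses.
X_train, X_test, y_train, y_test = train_test_split(
    responses, am_rates_hz, test_size=0.25, stratify=am_rates_hz, random_state=0
)
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000, random_state=0),
)
clf.fit(X_train, y_train)

# Report decoding error in octaves, the summary statistic quoted in the abstract.
pred = clf.predict(X_test)
octave_error = np.abs(np.log2(pred / y_test))
print(f"median decoding error: {np.median(octave_error):.2f} octaves")
```

The same scaffold could be used to probe the abstract's ablation result: dropping the most sharply tuned columns of `responses` before training and comparing the resulting median octave error against the full-population baseline.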
