Abstract

To synthesize a coherent representation of the external world, the brain must integrate inputs across different types of stimuli. Yet the mechanistic basis of this computation at the level of neuronal populations remains obscure. Here, we investigate tactile-auditory integration using two-photon Ca2+ imaging in the mouse primary (S1) and secondary (S2) somatosensory cortices. Pairing sound with whisker stimulation modulates tactile responses in both S1 and S2, with the most prominent modulation being robust inhibition in S2. The degree of inhibition depends on tactile stimulation frequency, with responses to lower frequencies the most severely attenuated. Alongside these neurons, we identify sound-selective neurons in S2 whose responses are inhibited by high tactile frequencies. These results are consistent with a hypothesized local, mutually inhibitory S2 circuit that spectrally selects tactile versus auditory inputs. Our findings enrich mechanistic understanding of multisensory integration and suggest a key role for S2 in combining auditory and tactile information.

Highlights

  • Transgenic mice expressing the Ca2+ indicator GCaMP6s under pan-neuronal promoters[12,13,14] were implanted with a chronic cranial window exposing up to 5 mm of the left hemisphere[15,16]. This approach allowed us to capture the majority of the auditory cortex along with the primary somatosensory (S1), secondary somatosensory (S2), and insular somatosensory field (ISF) cortices, all in the same window.

  • Based on stereotaxic coordinates, the most medial locus was identified as S1, the middle locus corresponded to S2, and the most lateral locus was consistent with the location of the ISF identified previously with intrinsic optical imaging[17].

Introduction

To synthesize a coherent representation of the external world, the brain must integrate inputs across different types of stimuli. A significant behavioral consequence of this interaction is observed when tactile and auditory inputs are not in register: as a human subject touches a surface, if a sound is played that differs from what is expected from the tactile sensation of the surface, the reported roughness of the surface is altered in a manner dependent on the sound frequency[6]. This "parchment-skin illusion" points to the deep bond shared by these modalities and hints at the centrality of stimulus frequency as a parameter that may bind them together[7,8]. In S2, we find a small number of sound-selective neurons whose responses are reciprocally attenuated by high-frequency whisker deflections. These frequency-dependent interactions point toward a spectrally dependent, mutually inhibitory circuit between touch-selective and sound-selective neurons and shed light on the neural circuits that may underlie the computations involved in multisensory integration.
