Abstract

Perception adapts to mismatching multisensory information, both when different cues appear simultaneously and when they appear sequentially. While both multisensory integration and adaptive trial-by-trial recalibration are central for behavior, it remains unknown whether they are mechanistically linked and arise from a common neural substrate. To relate the neural underpinnings of sensory integration and recalibration, we measured whole-brain magnetoencephalography while human participants performed an audio-visual ventriloquist task. Using single-trial multivariate analysis, we localized the perceptually-relevant encoding of multisensory information within and between trials. While we found neural signatures of multisensory integration within temporal and parietal regions, only medial superior parietal activity encoded past and current sensory information and mediated the perceptual recalibration within and between trials. These results highlight a common neural substrate of sensory integration and perceptual recalibration, and reveal a role of medial parietal regions in linking present and previous multisensory evidence to guide adaptive behavior.

Highlights

  • Multisensory information offers substantial benefits for behavior

  • To reveal the neural correlates of the ventriloquist aftereffect (VAE), we investigated three regression models capturing different aspects of how current and previous sensory information shape i) the neural encoding of the current sensory information in the A trial (i.e., the actual sound position, A_A), ii) the encoding of the upcoming response (R_A), and iii) how neural representations of previous sensory information contribute to the single-trial VAE bias (an illustrative regression sketch follows these highlights)

  • The significant effects of A_AV and V_AV overlapped in the cingulum and precuneus (Figure 3B; red inset). These results demonstrate that parietal regions represent information about previous multisensory stimuli, and that this information affects the neural encoding of the currently perceived sound.
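
The following is a minimal, illustrative sketch (not the authors' analysis code) of these three single-trial regression models in Python, using ordinary least squares on synthetic data; all variable names (a_a, a_av, v_av, r_a, y) are assumptions chosen to mirror the paper's notation A_A, A_AV, V_AV, and R_A.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500                                      # number of auditory (A) trials (synthetic)
    a_av = rng.uniform(-20, 20, n)               # sound position in the preceding AV trial
    v_av = a_av + rng.choice([-15, 0, 15], n)    # spatially displaced visual position
    a_a  = rng.uniform(-20, 20, n)               # sound position in the current A trial
    vae  = 0.1 * (v_av - a_av) + rng.normal(0, 2, n)   # simulated trial-wise aftereffect
    r_a  = a_a + vae                             # localization response in the A trial
    y    = 0.5 * a_a + 0.2 * (v_av - a_av) + rng.normal(0, 1, n)  # synthetic neural signal

    def ols_beta(X, y):
        # Ordinary-least-squares regression weights, with an intercept prepended.
        X = np.column_stack([np.ones(len(y)), X])
        return np.linalg.lstsq(X, y, rcond=None)[0]

    b_stim = ols_beta(a_a[:, None], y)   # i)   neural encoding of the current sound (A_A)
    b_resp = ols_beta(r_a[:, None], y)   # ii)  neural encoding of the upcoming response (R_A)
    b_bias = ols_beta(np.column_stack([v_av - a_av, y]), r_a - a_a)
                                         # iii) previous AV discrepancy and the neural signal
                                         #      jointly predicting the single-trial VAE bias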


Introduction

Multisensory information offers substantial benefits for behavior. For example, acoustic and visual cues can be combined to derive a more reliable estimate of where an object is located (Alais and Burr, 2004; Ernst and Banks, 2002; Körding et al., 2007; Wozny and Shams, 2011b). In the classic ventriloquist situation, the sight of the puppet and the actor’s voice are combined when localizing the speech source, and both cues influence the localization of subsequent unisensory acoustic cues when probed experimentally (Bosen et al., 2017; Bosen et al., 2018; Bruns and Röder, 2015; Bruns and Röder, 2017; Callan et al., 2015; Radeau and Bertelson, 1974; Recanzone, 1998). This trial-by-trial recalibration of perception by previous multisensory information has been demonstrated for spatial cues, temporal cues, and speech signals (Kilian-Hütten et al., 2011a; Lüttke et al., 2016; Lüttke et al., 2018; Van der Burg et al., 2013), and is modulated by attention (Eramudugolla et al., 2011). Despite the importance of both facets of multisensory perception for adaptive behavior - the combination of information within a trial and the trial-by-trial adjustment of perception - it remains unclear whether they originate from shared neural mechanisms.
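
The cue-combination account cited above can be summarized in a few lines. Below is a minimal sketch, in Python, of the standard maximum-likelihood (reliability-weighted) model assumed by these studies; the function name and example numbers are illustrative, not taken from this paper.

    def fuse(s_a, var_a, s_v, var_v):
        # Reliability-weighted average of an auditory and a visual location cue:
        # each cue is weighted by its inverse variance, and the fused estimate
        # is always at least as reliable as the better single cue.
        w_a = (1 / var_a) / (1 / var_a + 1 / var_v)
        s_hat = w_a * s_a + (1 - w_a) * s_v
        var_hat = 1 / (1 / var_a + 1 / var_v)
        return s_hat, var_hat

    # Example: a noisy sound heard at 10 deg, a precise flash seen at 4 deg.
    print(fuse(10.0, 9.0, 4.0, 1.0))  # -> (4.6, 0.9): the estimate is drawn toward vision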
