Abstract
Author Summary
When faced with ecologically relevant stimuli in natural scenes, our brains need to coordinate information from multiple sensory systems in order to create accurate internal representations of the outside world. Unfortunately, we currently have little information about the neuronal mechanisms for this cross-modal processing during online sensory perception under natural conditions. Neurophysiological and human imaging studies are increasingly exploring the response properties elicited by natural scenes. In this study, we recorded magnetoencephalography (MEG) data from participants viewing audiovisual movie clips. We developed a phase coherence analysis technique that captures—in single trials of watching a movie—how the phase of cortical responses is tightly coupled to key aspects of stimulus dynamics. Remarkably, auditory cortex not only tracks auditory stimulus dynamics but also reflects dynamic aspects of the visual signal. Similarly, visual cortex mainly follows the visual properties of a stimulus, but also shows sensitivity to the auditory aspects of a scene. The critical finding is that cross-modal phase modulation appears to lie at the basis of this integrative processing. Continuous cross-modal phase modulation may permit the internal construction of behaviorally relevant stimuli. Our work therefore contributes to the understanding of how multi-sensory information is analyzed and represented in the human brain.
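The phase coherence measure described above quantifies how reproducible the phase of the cortical response is across repeated presentations of the same movie. As a rough illustration of the general technique only, not the authors' exact pipeline, the sketch below computes cross-trial phase coherence for a single MEG sensor; the 2–7 Hz band, the filter order, and the (trials × samples) data layout are assumptions made for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def intertrial_phase_coherence(trials, fs, band=(2.0, 7.0)):
    """Cross-trial phase coherence for one MEG sensor.

    trials : array, shape (n_trials, n_samples) -- responses to
             repeated presentations of the same movie clip.
    fs     : sampling rate in Hz.
    band   : frequency band whose phase is tracked (the 2-7 Hz
             range here is an assumption, not taken from the text).

    Returns an (n_samples,) array in [0, 1]: values near 1 mean the
    phase at that time point is consistent across trials, i.e. the
    response is locked to the stimulus dynamics.
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, trials, axis=1)     # band-limit each trial
    phase = np.angle(hilbert(filtered, axis=1))   # instantaneous phase
    # Length of the mean unit phase vector across trials (phase-locking value)
    return np.abs(np.mean(np.exp(1j * phase), axis=0))
```

In such a scheme, comparing the coherence time course obtained with the original soundtrack against one obtained with a mismatched soundtrack would expose the kind of cross-modal phase modulation described above.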
Highlights
We do not experience the world as parallel sensory streams; rather, the information extracted from different modalities fuses to form a seamlessly unified multi-sensory percept dynamically evolving over time
We developed a phase coherence analysis technique that captures—in single trials of watching a movie—how the phase of cortical responses is tightly coupled to key aspects of stimulus dynamics
Visual cortex mainly follows the visual properties of a stimulus, but also shows sensitivity to the auditory aspects of a scene
Summary
We do not experience the world as parallel sensory streams; rather, the information extracted from different modalities fuses to form a seamlessly unified multi-sensory percept dynamically evolving over time. One typical view posits that multisensory integration occurs at later stages of cortical processing, subsequent to unisensory analysis. This view has been supported by studies showing that higher, "association" areas in temporal, parietal, and frontal cortices receive inputs from multiple unimodal areas [5,6,7,8] and respond to stimulation in a manner that reflects multisensory convergence, for example with amplified or suppressed responses for multimodal over unimodal stimuli [9,10,11,12]. A growing body of evidence provides a complementary view, suggesting that cross-modal interaction is not restricted to association areas and can occur at early, putatively unisensory cortical processing stages [11,13]. Multisensory integration may operate through lateral cross-sensory modulation, and there exist multiple integration pathways beyond purely hierarchical convergence [12,28,29].