Abstract

In a naturalistic environment, auditory cues are often accompanied by information from other senses, which can be redundant with or complementary to the auditory information. Although the multisensory interactions that arise from this combination of information and that shape auditory function are seen across all sensory modalities, our greatest body of knowledge to date centers on how vision influences audition. In this review, we attempt to capture the current state of understanding of this topic. Following a general introduction, the review is divided into five sections. In the first section, we review the psychophysical evidence in humans regarding vision's influence on audition, distinguishing between vision's ability to enhance versus alter auditory performance and perception. Three examples are then described that serve to highlight vision's ability to modulate auditory processes: spatial ventriloquism, cross-modal dynamic capture, and the McGurk effect. The final part of this section discusses models that have been built from the available psychophysical data and that seek to provide greater mechanistic insight into how vision can impact audition. The second section reviews the extant neuroimaging and far-field imaging work on this topic, with a strong emphasis on the roles of feedforward and feedback processes, on imaging insights into the causal nature of audiovisual interactions, and on the limitations of current imaging-based approaches. These limitations point to a greater need for machine-learning-based decoding approaches to understanding how auditory representations are shaped by vision. The third section reviews the wealth of neuroanatomical and neurophysiological data from animal models that highlights audiovisual interactions at the neuronal and circuit level in both subcortical and cortical structures. It also speaks to the functional significance of audiovisual interactions for two critically important facets of auditory perception: scene analysis and communication. The fourth section presents current evidence for alterations in audiovisual processes in three clinical conditions: autism, schizophrenia, and sensorineural hearing loss. These changes in audiovisual interactions are postulated to have cascading effects on higher-order domains of dysfunction in these conditions. The final section highlights ongoing work that seeks to leverage our knowledge of audiovisual interactions to develop better remediation approaches for these sensory-based disorders, founded in concepts of perceptual plasticity in which vision has been shown to have the capacity to facilitate auditory learning.
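The mechanistic models mentioned in the abstract most often take the form, in this literature, of reliability-weighted (maximum-likelihood) cue combination, in which each cue's estimate is weighted by its inverse variance. Below is a minimal Python sketch of that computation; the function name and the example numbers are illustrative assumptions, not material from the review.

```python
def mle_combine(est_a, var_a, est_v, var_v):
    """Reliability-weighted (maximum-likelihood) fusion of an auditory
    and a visual estimate of the same quantity (e.g., source location).
    Each cue is weighted by its inverse variance, so the more reliable
    cue dominates, and the fused variance is lower than either alone."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)
    fused = w_a * est_a + (1.0 - w_a) * est_v
    fused_var = 1.0 / (1.0 / var_a + 1.0 / var_v)
    return fused, fused_var

# A blurry auditory location (high variance) paired with a sharp visual
# one (low variance): the fused percept sits near the visual location,
# mimicking spatial ventriloquism.
print(mle_combine(est_a=10.0, var_a=16.0, est_v=0.0, var_v=1.0))
# -> (~0.59, ~0.94)
```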

Highlights

  • We live in a multisensory world, in which we are continually bombarded with sensory information from a variety of sources borne through various forms of environmental energy.

  • These results suggest a greater reliance on visual cues during speech perception, which complements the above findings that V1/V2 is recruited in cochlear implant (CI) users during auditory-only listening.

  • The general theme of this work is that under naturalistic circumstances we are almost continually challenged with information coming from multiple senses, and that the brain makes use of both redundant and complementary information in order to generate adaptive behavioral benefits and to create a coherent perceptual reality.



INTRODUCTION

We live in a multisensory world, in which we are continually bombarded with sensory information from a variety of sources borne through various forms of environmental energy. From a purely adaptive perspective, having information available from more than a single sense provides tremendous advantages in terms of both the redundant and complementary information that is conveyed. These benefits have been illustrated in a variety of tasks across almost all possible sensory combinations and have been shown to improve stimulus detection, localization, and response accuracy, as well as to speed responses. The two best-studied sensory systems with regard to multisensory function are the auditory and visual systems. The reasons for this are many, but interest in these interactions likely stems from the extrapersonal nature of both senses (i.e., they represent things happening at a distance from the body), the ease with which parametric manipulations of a number of stimulus dimensions can be carried out, and the well-characterized nature of these two senses and their associated brain organization. We can divide visual influences on auditory perception into two broad categories: perceptual enhancements, in which task-relevant or task-irrelevant visual information improves performance on an auditory task, and perceptual alterations, in which conflicting (but task-relevant) visual information can change the nature of the auditory percept.
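The response speeding noted above is conventionally tested against Miller's race-model inequality, F_AV(t) ≤ F_A(t) + F_V(t): if audiovisual reaction times beat this bound, parallel but independent unisensory processing cannot explain the benefit. The Python sketch below is a generic illustration of that test; the function names, the simulated data, and the grid choice are assumptions for demonstration, not analysis code from the studies reviewed here.

```python
import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative distribution of reaction times on a time grid."""
    rts = np.asarray(rts, dtype=float)
    return np.mean(rts[:, None] <= np.asarray(t_grid)[None, :], axis=0)

def race_model_violation(rt_a, rt_v, rt_av, t_grid):
    """Miller's inequality: F_AV(t) <= min(F_A(t) + F_V(t), 1).
    Positive return values indicate audiovisual responses faster than any
    race between independent unisensory processes could produce, which is
    taken as evidence of genuine multisensory integration."""
    bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
    return ecdf(rt_av, t_grid) - bound

# Illustrative use with simulated reaction times (in seconds).
rng = np.random.default_rng(0)
t = np.linspace(0.1, 0.6, 50)
violation = race_model_violation(rng.normal(0.35, 0.05, 200),
                                 rng.normal(0.40, 0.05, 200),
                                 rng.normal(0.28, 0.04, 200), t)
print("max violation:", violation.max())  # > 0 suggests integration
```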

Vision Can Enhance Auditory Perceptual Performance
Vision Can Alter Auditory Perception
Spatial Ventriloquism
Crossmodal Dynamic Capture
McGurk Effect
Mechanistic Principles of Visual Influences on Auditory Perception
Limitations of Neuroimaging Studies and Striving Towards Solutions
Methods to Study Audiovisual Interactions in the Auditory System
Visual Inputs and Audiovisual Responses in Subcortical Regions
Hypothesized Function of Audiovisual Interactions in the Inferior Colliculus
Visual Inputs and Audiovisual Responses in Auditory Cortex
Auditory Scene Analysis
Processing of Communication Signals
Autism Spectrum Disorder
Sensorineural Hearing Loss and Cochlear Implant Users
Visual Facilitation of Auditory Spatial Learning
Mechanisms Underlying the Visual Facilitation of Auditory Perceptual Learning
Concluding Remarks