Abstract

While converging evidence implicates the right inferior parietal lobule in audiovisual integration, its role has not been fully elucidated by direct manipulation of cortical activity. Replicating and extending an experiment initially reported by Kamke et al. (2012), we employed the sound-induced flash illusion, in which a single visual flash, when accompanied by two auditory tones, is misperceived as multiple flashes (Wilson, 1987; Shams et al., 2000). Slow repetitive (1 Hz) TMS administered to the right angular gyrus, but not to the right supramarginal gyrus, induced a transient decrease in Peak Perceived Flashes (PPF), reflecting reduced susceptibility to the illusion. This finding independently confirms that perturbing networks involved in multisensory integration can yield a more veridical representation of asynchronous auditory and visual events, and it supports the view that cross-modal integration is an active process whose objective is the identification of a meaningful constellation of inputs, at times at the expense of accuracy.

Highlights

  • Audiovisual integration is a critical feature of sensory processing that allows for the creation of coherent percepts from disparate sensory streams

  • The effect of transcranial magnetic stimulation (TMS) to the angular gyrus (AG) on Peak Perceived Flashes (PPF), measured as the difference between the number of flashes perceived in the TMS and no-TMS conditions, differed significantly from that observed after stimulation of the supramarginal gyrus (SMG) during illusion trials [paired t-test; t(8) = −2.429, p = 0.041], but not during double-flash trials [t(8) = −1.045, p = 0.326] (Figure 4; an illustrative computation of this paired comparison is sketched after this list)

  • Consistent with the findings of Kamke et al. (2012), focal cortical inhibition in our subjects produced more veridical visual perception on the sound-induced flash illusion task when applied to the AG but not to the SMG of the right hemisphere
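
As an illustration only, the sketch below shows how a two-tailed paired t-test on per-subject PPF difference scores (TMS minus no TMS) might be computed for the AG versus SMG stimulation sites. The subject values, variable names, and use of SciPy are assumptions introduced here for demonstration; they are not the study's data or analysis code.

    # Illustrative sketch only: a two-tailed paired t-test on per-subject PPF
    # difference scores (TMS minus no TMS) for AG versus SMG stimulation.
    # The nine values per site are hypothetical placeholders (df = 8 implies
    # nine subjects); they are not data from the study.
    import numpy as np
    from scipy import stats

    ppf_diff_ag  = np.array([-0.4, -0.2, -0.3, -0.1, -0.5, -0.2, -0.3, -0.1, -0.2])
    ppf_diff_smg = np.array([-0.1,  0.0, -0.1,  0.1, -0.2,  0.0, -0.1,  0.1,  0.0])

    # Paired comparison across the same nine subjects
    result = stats.ttest_rel(ppf_diff_ag, ppf_diff_smg)
    print(f"t({len(ppf_diff_ag) - 1}) = {result.statistic:.3f}, p = {result.pvalue:.3f}")

A paired design is appropriate here because each subject contributes a difference score at both stimulation sites, so between-subject variability is removed from the comparison.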

Introduction

Audiovisual integration is a critical feature of sensory processing that allows for the creation of coherent percepts from disparate sensory streams. According to some recent models of temporal processing, the right inferior parietal lobule (RIPL) may have a specific role in both unimodal and multimodal event order judgments (Snyder and Chatterjee, 2004; Battelli et al., 2007), and may contribute to the perception of synchrony between events across sensory modalities. Consistent with this and with some prior imaging results, we previously described a patient with right parietal injury who acquired an isolated inability to integrate synchronous auditory and visual events, perceiving simultaneous stimuli (e.g., spoken speech sounds and congruent lip movements) as being mismatched in time (Hamilton et al., 2006; see Calvert, 2001; Bernstein et al., 2008).
