Abstract

The human brain exhibits a highly adaptive ability to reduce natural asynchronies between visual and auditory signals. Even though this mechanism robustly modulates the subsequent perception of sounds and visual stimuli, it is still unclear how such a temporal realignment is attained. In the present study, we investigated whether or not temporal adaptation generalizes across different auditory frequencies. In a first exposure phase, participants adapted to a fixed 220-ms audiovisual asynchrony or else to synchrony for 3 min. In a second phase, the participants performed simultaneity judgments (SJs) regarding pairs of audiovisual stimuli that were presented at different stimulus onset asynchronies (SOAs) and included either the same tone as in the exposure phase (a 250 Hz beep), another low-pitched beep (300 Hz), or a high-pitched beep (2500 Hz). Temporal realignment was always observed (when comparing SJ performance after exposure to asynchrony vs. synchrony), regardless of the frequency of the sound tested. This suggests that temporal recalibration influences the audiovisual perception of sounds in a frequency non-specific manner and may imply the participation of non-primary perceptual areas of the brain that are not constrained by certain physical features such as sound frequency.

Highlights

  • Audiovisual signals referring to the same external event often arrive asynchronously at their corresponding perceptual brain areas

  • In a subsequent test phase, the participants performed a simultaneity judgment (SJ) task regarding auditory and visual stimuli presented at nine different stimulus onset asynchronies (SOAs) using the method of constant stimuli (±330, ±190, ±105, ±45, or 0 ms; negative values indicate that the sound was presented first)

  • Further analyses revealed that the shift in point of subjective simultaneity (PSS) after exposure to asynchrony vs. synchrony did not differ significantly between the adapted test tone (250 Hz) and the 300 Hz test tone (Wilcoxon signed-rank test: Z = −0.17, p = 0.87), but did differ between the adapted (250 Hz) and the 2500 Hz tone conditions (Wilcoxon signed-rank test: Z = −2.4, p = 0.018)
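To illustrate how a PSS is typically obtained from an SJ task of this kind, the sketch below fits a Gaussian-shaped psychometric function to the proportion of "simultaneous" responses at each SOA; the peak location of the fitted curve is the PSS. The SOA values are those reported above, but the response proportions are made up for illustration, and the Gaussian model is a common convention in the SJ literature rather than necessarily the exact function used in this study.

```python
import numpy as np
from scipy.optimize import curve_fit

# SOAs from the SJ task (ms); negative values = sound presented first
soas = np.array([-330, -190, -105, -45, 0, 45, 105, 190, 330])

# Hypothetical proportions of "simultaneous" responses (illustration only)
p_simultaneous = np.array([0.05, 0.20, 0.55, 0.85, 0.95,
                           0.90, 0.60, 0.25, 0.05])

def gaussian(soa, amplitude, pss, sigma):
    """Scaled Gaussian: the location of its peak is taken as the
    point of subjective simultaneity (PSS)."""
    return amplitude * np.exp(-((soa - pss) ** 2) / (2 * sigma ** 2))

# Fit the model; initial guesses: full amplitude, PSS at 0 ms, ~100 ms width
params, _ = curve_fit(gaussian, soas, p_simultaneous, p0=[1.0, 0.0, 100.0])
amplitude, pss, sigma = params
print(f"PSS = {pss:.1f} ms, width (sigma) = {sigma:.1f} ms")
```

Comparing the PSS fitted after asynchronous exposure with the PSS fitted after synchronous exposure gives the temporal-realignment measure analyzed in the highlight above.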


Introduction

Audiovisual signals referring to the same external event often arrive asynchronously at their corresponding perceptual brain areas. This is both because different kinds of energy (e.g., light and sound waves) do not travel at the same velocity through the air (roughly 300,000,000 vs. 340 m/s, respectively), and because the speed of neural transmission differs between vision and audition (see Spence and Squire, 2003; Schroeder and Foxe, 2004; King, 2005). The very existence of such a temporal window of integration, and the fact that more asynchrony is tolerated when visual inputs lead auditory inputs (as when we perceive distant events), could be seen as an adaptive consequence of perceiving an intrinsically asynchronous external world.
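To make the physical contribution to this asynchrony concrete, consider a hypothetical event 10 m from the observer:

```latex
\Delta t_{\text{sound}} = \frac{10\ \text{m}}{340\ \text{m/s}} \approx 29\ \text{ms},
\qquad
\Delta t_{\text{light}} = \frac{10\ \text{m}}{3\times 10^{8}\ \text{m/s}} \approx 33\ \text{ns}
```

The sound therefore lags the light by roughly 3 ms per metre of viewing distance, so even everyday distances produce asynchronies well within the range probed by the SOAs used here.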

