Abstract

The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships remains poorly understood. Recent studies indicate that our perception of multisensory timing is flexible: adaptation to a regular inter-modal delay alters the point at which subsequent stimuli are judged to be simultaneous. Here, we measure the effect of audio-visual asynchrony adaptation on the perception of a wide range of sub-second temporal relationships. We find distinctive patterns of induced biases that are inconsistent with previous explanations based on changes in perceptual latency. Instead, our results can be well accounted for by a neural population coding model in which: (i) relative audio-visual timing is represented by the distributed activity across a relatively small number of neurons tuned to different delays; (ii) the algorithm for reading out this population code is efficient, but subject to biases owing to under-sampling; and (iii) the effect of adaptation is to modify neuronal response gain. These results suggest that multisensory timing information is represented by a dedicated population code and that shifts in perceived simultaneity following asynchrony adaptation arise from analogous neural processes to well-known perceptual after-effects.
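
To make the three model components concrete, here is a minimal Python sketch. It is illustrative only: the channel count, Gaussian tuning curves, tuning width, adapter delay, and gain-reduction rule are all assumptions rather than parameters taken from the paper, and the response-weighted read-out is just one simple efficient estimator (the paper's actual read-out rule may differ).

    import numpy as np

    # (i) A small bank of channels tuned to different audio-visual delays.
    # Channel count, spacing, and tuning width are illustrative assumptions.
    prefs = np.linspace(-500.0, 500.0, 9)   # preferred SOAs in ms (audio lag positive)
    sigma = 150.0                           # Gaussian tuning width in ms

    def responses(soa, gains):
        """Population response of all channels to a stimulus at the given SOA."""
        return gains * np.exp(-0.5 * ((soa - prefs) / sigma) ** 2)

    def decode(soa, gains):
        """(ii) Read out the code as a response-weighted average of preferred
        delays. With few, coarsely spaced channels this estimator is itself
        biased near the edges of the represented range (under-sampling)."""
        r = responses(soa, gains)
        return np.sum(r * prefs) / np.sum(r)

    # (iii) Adaptation modeled as a gain reduction proportional to each
    # channel's response to the adapter (an assumed instantiation).
    adapter = 200.0                         # adapting SOA: audio lags by 200 ms
    baseline = np.ones_like(prefs)
    adapted = baseline - 0.5 * responses(adapter, baseline)

    for test in (-200, -100, 0, 100, 200):
        pre = decode(test, baseline)
        post = decode(test, adapted)
        print(f"test {test:+4d} ms: pre {pre:+7.1f}, post {post:+7.1f}, bias {post - pre:+6.1f}")

Running the sketch shows decoded estimates pushed away from the adapted delay by amounts that depend on the adapter-test difference; a physically synchronous test is no longer decoded as 0 ms, which is one way a shift in perceived simultaneity can emerge from gain changes alone.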

Highlights

  • We typically perceive external events as coherent multisensory entities

  • In contrast to the uniform recalibration predicted by changes in sensory processing latency, we find that the magnitude of induced biases varies systematically as a function of the difference in stimulus-onset asynchrony (SOA) between adapting and test stimuli (see the toy comparison sketched after this list)

  • To avoid potential problems encountered at the extremes of the sampled SOA interval (e.g. ‘clipping’ of estimates that would have fallen outside the range), subsequent analysis was restricted to the range between −200 ms and +200 ms
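
The contrast drawn in the second bullet can be illustrated with a self-contained toy comparison, consistent with the sketch under the Abstract. All numbers are invented for illustration: a pure change in processing latency adds the same constant to every perceived SOA, whereas a delay-tuned account predicts biases that peak at intermediate adapter-test differences and fade far from the adapter.

    import numpy as np

    tests = np.linspace(-300.0, 300.0, 7)   # test SOAs (ms)
    adapter = 200.0                         # adapting SOA (ms)

    # Latency account: an assumed fixed 40 ms shift biases all tests equally.
    latency_bias = np.full_like(tests, 40.0)

    # Tuned account: an assumed derivative-of-Gaussian bias profile, largest
    # at intermediate adapter-test differences and near zero far away.
    d = tests - adapter
    tuned_bias = 60.0 * (d / 150.0) * np.exp(-0.5 * (d / 150.0) ** 2)

    for t, lb, tb in zip(tests, latency_bias, tuned_bias):
        print(f"test {t:+6.0f} ms: latency {lb:+5.1f} ms, tuned {tb:+6.1f} ms")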

Introduction

We typically perceive external events as coherent multisensory entities. When a balloon pops in front of us, for example, we see and hear it happen simultaneously.
