Abstract

Studies of the audiovisual perception of distance are rare. Here, visual and auditory cue interactions in distance perception are tested against several multisensory models, including a modified causal inference model that incorporates predictions of the distributions of the estimates. In our study, the audiovisual perception of distance was overall better explained by Bayesian causal inference than by traditional models such as sensory dominance, mandatory integration, and no interaction. Causal inference resolved with probability matching yielded the best fit to the data. Finally, we propose that sensory weights can also be estimated from causal inference. Analysis of these sensory weights yields windows within which the audiovisual stimuli interact. We find that the visual stimulus always contributes more than 80% to the perception of visual distance. The visual stimulus also contributes more than 50% to the perception of auditory distance, but only within a mobile window of interaction ranging from 1 to 4 m.
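To make the abstract's "causal inference resolved with probability matching" concrete, the following is a minimal sketch of the standard Bayesian causal-inference decision rule for an auditory distance estimate, assuming Gaussian likelihoods, a Gaussian prior over distance, and a prior probability of a common cause. All parameter values, variable names, and the specific formulation are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the paper's fitted values).
sigma_v, sigma_a = 0.3, 1.0   # sensory noise (m) for the visual and auditory cues
sigma_p, mu_p = 2.0, 3.0      # Gaussian prior over distance (m)
p_common = 0.5                # prior probability that both cues share one cause


def posterior_common(x_v, x_a):
    """Posterior probability that the visual and auditory measurements share a cause."""
    # Likelihood of (x_v, x_a) under a common cause (C = 1)
    var1 = sigma_v**2 * sigma_a**2 + sigma_v**2 * sigma_p**2 + sigma_a**2 * sigma_p**2
    like_c1 = np.exp(-0.5 * ((x_v - x_a) ** 2 * sigma_p**2
                             + (x_v - mu_p) ** 2 * sigma_a**2
                             + (x_a - mu_p) ** 2 * sigma_v**2) / var1) / (2 * np.pi * np.sqrt(var1))
    # Likelihood under independent causes (C = 2)
    like_c2 = (np.exp(-0.5 * (x_v - mu_p) ** 2 / (sigma_v**2 + sigma_p**2))
               / np.sqrt(2 * np.pi * (sigma_v**2 + sigma_p**2))
               * np.exp(-0.5 * (x_a - mu_p) ** 2 / (sigma_a**2 + sigma_p**2))
               / np.sqrt(2 * np.pi * (sigma_a**2 + sigma_p**2)))
    return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))


def auditory_estimate(x_v, x_a):
    """Auditory distance estimate under causal inference resolved with probability matching."""
    # Fused estimate (common cause): reliability-weighted average of both cues and the prior
    fused = ((x_v / sigma_v**2 + x_a / sigma_a**2 + mu_p / sigma_p**2)
             / (1 / sigma_v**2 + 1 / sigma_a**2 + 1 / sigma_p**2))
    # Segregated estimate (independent causes): auditory cue combined with the prior only
    segregated = (x_a / sigma_a**2 + mu_p / sigma_p**2) / (1 / sigma_a**2 + 1 / sigma_p**2)
    # Probability matching: use the fused estimate with probability p(C = 1 | x_v, x_a)
    return fused if rng.random() < posterior_common(x_v, x_a) else segregated


print(auditory_estimate(x_v=2.0, x_a=3.5))
```

In this sketch the relative contribution of the visual cue to the auditory estimate falls off as the cues become discrepant, which is one way a window of interaction like the 1 to 4 m range mentioned above could arise.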

Highlights

  • Crossmodal interactions are often analyzed in light of the multisensory perception theories available at the time

  • Visual distance estimates had an average error of 0.33 m (SD = 0.51 m)

  • The perceived visual distance was affected only by the visual stimulus distance (χ²(9,5) = 1731.23, p = 0.000) and not by the auditory stimulus (χ²(9,5) = 0.28, p = 1.000)

Introduction

Crossmodal interactions are often analyzed in light of the multisensory perception theories available at the time. The Maximum Likelihood Estimation (MLE) model in particular, which assumes that the weighting of the individual sensory cues is statistically optimal, has been broadly tested and applied to a number of cue combination cases [8,9,10,11,12,13,14].
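As a point of reference, the statistically optimal weighting assumed by the MLE model can be sketched as follows: each cue is weighted by its reliability (inverse variance), and the combined estimate has lower variance than either cue alone. The function name and the example values are illustrative assumptions, not values from this study.

```python
def mle_combine(x_v, sigma_v, x_a, sigma_a):
    """Reliability-weighted (MLE) combination of a visual and an auditory distance cue."""
    w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_a**2)   # visual weight
    w_a = 1 - w_v                                                # auditory weight
    s_hat = w_v * x_v + w_a * x_a                                # combined estimate
    sigma_hat = (sigma_v**2 * sigma_a**2 / (sigma_v**2 + sigma_a**2)) ** 0.5
    return s_hat, sigma_hat                                      # estimate and its SD


# Example with illustrative values: the more reliable visual cue dominates the estimate.
print(mle_combine(x_v=2.0, sigma_v=0.3, x_a=3.5, sigma_a=1.0))
```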

