Abstract

Humans coordinate their focus of attention with others, either by gaze following or prior agreement. Although the effects of joint attention on perceptual and cognitive processing tend to be examined in purely visual environments, they should also manifest in multisensory settings. According to a prevalent hypothesis, joint attention enhances visual information encoding and processing over and above individual attention. If two individuals jointly attend to the visual components of an audiovisual event, this should affect the weighting of visual information during multisensory integration. We tested this prediction in this preregistered study using the well-documented sound-induced flash illusions, in which the integration of an incongruent number of visual flashes and auditory beeps results in a single flash being seen as two (fission illusion) or two flashes as one (fusion illusion). Participants were asked to count flashes either alone or together; we expected them to be less prone to both fission and fusion illusions when they jointly attended to the visual targets. However, illusions were as frequent when people attended to the flashes alone as with someone else, even though they responded faster during joint attention. Our results reveal the limitations of the theory that joint attention enhances visual processing: it does not affect temporal audiovisual integration.

Highlights

  • Humans coordinate their focus of attention with others, either by gaze following or prior agreement

  • We found no significant difference between solo and joint attention, t(48) = −0.45, corrected p = 1, Cohen’s d = 0.06. As these results suggest that engaging in joint attention does not affect susceptibility to the fission illusion, we computed Bayes factors (BF) for this effect to assess the relative likelihoods of the null (H0) and alternative (H1) hypotheses

  • We investigated whether the hypothesis that joint attention can boost relative processing of co-attended sensory stimuli compared with solo attention (Becchio et al., 2008; Mundy, 2016, 2018; Shteynberg, 2015, 2018) extends to temporal multisensory integration
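The Bayes factor mentioned in the highlights can be computed directly from the reported t statistic and sample size. Below is a minimal sketch of the JZS (Jeffreys–Zellner–Siow) Bayes factor for a one-sample/paired t test (Rouder et al., 2009), using the reported t(48) = −0.45 (so N = 49); the prior scale `r` and the function name are illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy import integrate

def jzs_bf10(t, n, r=np.sqrt(2) / 2):
    """JZS Bayes factor BF10 for a one-sample (or paired) t test.

    t : observed t statistic; n : sample size; r : Cauchy prior scale
    (sqrt(2)/2 is a common default, assumed here).
    """
    nu = n - 1  # degrees of freedom
    # Marginal likelihood under H1: average the t likelihood over the
    # JZS prior on effect size (inverse-chi^2 mixing variable g).
    def integrand(g):
        a = 1 + n * g * r**2
        return (a ** -0.5
                * (1 + t**2 / (a * nu)) ** (-(nu + 1) / 2)
                * (2 * np.pi) ** -0.5 * g ** -1.5 * np.exp(-1 / (2 * g)))
    num, _ = integrate.quad(integrand, 0, np.inf)
    # Likelihood under H0 (effect size fixed at zero).
    den = (1 + t**2 / nu) ** (-(nu + 1) / 2)
    return num / den

# Reported fission-illusion comparison: t(48) = -0.45, hence N = 49.
bf10 = jzs_bf10(t=-0.45, n=49)
bf01 = 1 / bf10  # BF01 > 1 quantifies evidence for the null
```

With such a small t and moderate N, BF10 falls well below 1, i.e., the data favour H0 over H1, which is what the authors' null interpretation relies on.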


Introduction

Humans coordinate their focus of attention with others, either by gaze following or prior agreement. Soto-Faraco et al. (2005) and De Jong and Dijkerman (2019) both report that people are better at detecting and discriminating tactile stimuli on a body location when it is attended by another observer, represented by eye gaze cues. Extending these results, Nuku and Bekkering (2010) show that gaze cues from a virtual partner influence spatial auditory judgements, but only if the partner can hear the sounds.
