Abstract

Humans’ ability to detect relevant sensory information while engaged in a demanding task is crucial in daily life. Yet, limited attentional resources restrict information processing. To date, it is still debated whether there are distinct pools of attentional resources for each sensory modality and to what extent the process of multisensory integration depends on attentional resources. We addressed these two questions using a dual-task paradigm. Specifically, participants performed a multiple object tracking task and a detection task either separately or simultaneously. In the detection task, participants were required to detect visual, auditory, or audiovisual stimuli at varying stimulus intensities that were adjusted using a staircase procedure. We found that the two tasks significantly interfered with each other. However, the interference was about 50% lower when the tasks were performed in separate sensory modalities rather than in the same sensory modality, suggesting that attentional resources are partly shared. Moreover, we found that perceptual sensitivities were significantly improved for audiovisual stimuli relative to unisensory stimuli, regardless of whether attentional resources were diverted to the multiple object tracking task or not. Overall, the present study supports the view that attentional resource allocation in multisensory processing is task-dependent and suggests that multisensory benefits are not dependent on attentional resources.
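The abstract refers to a staircase procedure for adjusting stimulus intensities, but this excerpt does not specify which adaptive rule was used. As a rough illustration only, the sketch below implements a generic 1-up/2-down staircase in Python; the function `run_staircase`, its parameters, and the simulated observer are hypothetical placeholders and not the authors' actual procedure.

```python
import random

def run_staircase(present_trial, start_intensity=1.0, step=0.1,
                  n_trials=40, n_down=2):
    """Generic 1-up/n-down staircase: decrease intensity after `n_down`
    consecutive detections, increase it after each miss (converges near
    ~71% detection for n_down=2). All parameters are illustrative."""
    intensity = start_intensity
    streak = 0
    history = []
    for _ in range(n_trials):
        detected = present_trial(intensity)  # True if the stimulus was detected
        history.append((intensity, detected))
        if detected:
            streak += 1
            if streak >= n_down:
                intensity = max(intensity - step, 0.0)
                streak = 0
        else:
            intensity += step
            streak = 0
    return history

# Hypothetical usage with a simulated observer whose detection probability
# grows with intensity (purely for illustration).
if __name__ == "__main__":
    simulated_observer = lambda i: random.random() < min(i, 1.0)
    trials = run_staircase(simulated_observer)
    print("final intensity:", trials[-1][0])
```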

Highlights

  • The environment provides far more sensory input than can be effectively processed by the human brain

  • We investigated two research questions: (a) Are attentional resources shared or distinct for the sensory modalities when a detection task is performed in combination with a visuospatial task? and (b) Is multisensory integration in a detection task affected when attentional resources are diverted to a simultaneously presented visuospatial task?


Introduction

The environment provides far more sensory input than can be effectively processed by the human brain. In visual search, humans need to allocate attentional resources to locations in space (i.e., a spatial attention component of the task) and discriminate whether the attended location contains a target or a distractor (i.e., an object-based attention component of the task; Eimer, 2014; Ghorashi et al., 2010). Previous findings (Macdonald & Lavie, 2011; Raveh & Lavie, 2015) suggest that the attentional resources required for visual stimulus discrimination are shared with the resources required for auditory stimulus detection. Given that a visual search task has an object-based attention component (i.e., discriminating targets from distractors), it has not yet been explored whether performing a purely visuospatial task (i.e., one without any requirement to discriminate stimulus features) affects the ability to detect auditory stimuli to the same degree as the ability to detect visual stimuli. If multisensory integration in a detection task does not depend on visuospatial attentional resources, multisensory integration should not be affected by performing a multiple object tracking (MOT) task simultaneously with the detection task.

Participants
Experimental Setup
Staircase Procedure Prior to the Experiment
Experimental Procedure
Results
Discussion