Abstract

Selective attention refers to the ability to focus on goal-relevant information while filtering out irrelevant information. In a multisensory context, how do people selectively attend to multiple inputs when making categorical decisions? Here, we examined the role of selective attention in cross-modal categorization in two experiments. In a speeded categorization task, participants were asked to attend to visual or auditory targets and categorize them while ignoring irrelevant stimuli. A response-time extended multinomial processing tree (RT-MPT) model was implemented to estimate the contributions of attentional focusing on task-relevant information and attentional filtering of distractors. The results indicated that the role of selective attention was modality-specific, with differences in attentional focusing and filtering between the visual and auditory modalities. Visual information could be focused on or filtered out more effectively, whereas auditory information was more difficult to filter out, causing greater interference with task-relevant performance. These findings suggest that selective attention plays a critical and differential role across modalities, offering a novel and promising approach to understanding multisensory processing and the focusing and filtering mechanisms of categorical decision-making.
