Abstract

Momentary lapses in attention disrupt goal-directed behavior and have been associated with increased pre-stimulus activity in the default mode network (DMN). Because the human brain constantly encounters multisensory inputs, it remains unknown whether the neural mechanisms underlying attentional lapses are supra-modal or modality-dependent. To answer this question, in the present functional magnetic resonance imaging (fMRI) study we asked participants to respond to either visual or auditory targets in a multisensory paradigm, and we focused on the pre-stimulus neural signals underlying attentional lapses, i.e., lapses that impaired task performance in each sensory modality in terms of both delayed reaction times (RTs) and behavioral errors. Behaviorally, mean RTs were equivalent between the visual and auditory modalities. At the neural level, increased pre-stimulus neural activity in the majority of the core DMN regions, including the medial prefrontal cortex (mPFC), posterior cingulate cortex (PCC), and left angular gyrus (AG), predicted delayed RTs more effectively in the visual than in the auditory modality. Notably, increased pre-stimulus activity in the mPFC predicted not only delayed RTs but also errors, again more effectively in the visual than in the auditory modality. Conversely, increased pre-stimulus activity in the anterior precuneus predicted both prolonged RTs and errors more effectively in the auditory than in the visual modality. Moreover, a supra-modal mechanism was revealed in the left middle temporal gyrus (MTG), which belongs to the posterior DMN: increased pre-stimulus neural activity in the left MTG predicted impaired task performance in both the visual and auditory modalities. Taken together, the core DMN regions manifest vision-dependent mechanisms of attentional lapses, whereas a novel region in the anterior precuneus shows an audition-dependent mechanism. Moreover, the left MTG in the posterior DMN manifests a supra-modal mechanism of attentional lapses, independent of the modality of the sensory inputs.
