Abstract

The human brain is inherently limited in the information it can make consciously accessible. When people monitor a rapid stream of visual items for two targets, they typically fail to see the second target if it occurs within 200–500 ms of the first, a phenomenon called the attentional blink (AB). The neural basis for the AB is poorly understood, partly because conventional neuroimaging techniques cannot resolve visual events displayed close together in time. Here we introduce an approach that characterises the precise effect of the AB on behaviour and neural activity. We employ multivariate encoding analyses to extract feature-selective information carried by randomly oriented gratings. We show that feature selectivity is enhanced for correctly reported targets and suppressed when the same items are missed, whereas irrelevant distractor items are unaffected. The findings suggest that the AB involves both short- and long-range neural interactions between visual representations competing for access to consciousness.

Highlights

  • The human brain is inherently limited in the information it can make consciously accessible

  • Functional magnetic resonance imaging lacks the temporal resolution to accurately characterise neural activity associated with rapid serial visual presentation (RSVP) tasks presented at rates of 8–12 Hz, which are commonly used to elicit the AB [6,7]

  • This finding is consistent with the behavioural results, which suggest a discrete model of the attentional blink (AB). These results indicate that the AB is associated with a reduction in gain, but not width, of feature-selective information for the second target item (T2), and that this effect occurs soon after the target appears within the RSVP stream

Introduction

The human brain is inherently limited in the information it can make consciously accessible. Mass-univariate approaches applied to fMRI or EEG data measure only overall neural activity, providing no information about how that activity represents featural information carried by single items (e.g., their orientation). These measures therefore cannot determine how the AB affects the neural representation of visual information, which could conceivably reflect a reduction in gain, an increase in tuning sharpness, or both. We overcome these limitations by combining recently developed multivariate modelling techniques for neuroimaging [9,10,11,12,13,14,15,16] with an RSVP task designed to determine the neural and behavioural basis of the AB. Forward (or inverted) encoding modelling uses multivariate linear regression to recover the feature-selective information contained within patterns of brain activity. This approach allowed us to explicitly measure the neural representation of specific features (in this case, orientation-selective information elicited by grating stimuli) separately for each item within an entire RSVP stream.
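The logic of forward (inverted) encoding modelling can be illustrated with a minimal simulation: a bank of orientation-tuned basis channels defines the predicted channel responses for each trial, multivariate linear regression estimates how those channels map onto sensors, and inverting that mapping reconstructs an orientation-selective response profile from held-out activity. This is only a sketch of the general technique, not the authors' exact pipeline; the channel count, basis shape, sensor count, and noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels, n_sensors, n_trials = 6, 32, 200
centres = np.arange(n_channels) * 180 / n_channels   # channel centres in degrees

def channel_basis(oris):
    """Idealised channel responses to orientations (180-deg-periodic tuning)."""
    diff = oris[:, None] - centres[None, :]
    # even exponent keeps the tuning curve positive and 180-deg periodic
    return np.cos(np.pi * diff / 180.0) ** 6

# --- simulated training data: sensors = weights @ channel responses + noise ---
oris_train = rng.uniform(0, 180, n_trials)
C_train = channel_basis(oris_train).T                # (channels, trials)
W_true = rng.normal(size=(n_sensors, n_channels))    # unknown channel-to-sensor weights
B_train = W_true @ C_train + 0.1 * rng.normal(size=(n_sensors, n_trials))

# --- forward model: estimate the weights by multivariate least squares ---
W_hat = B_train @ np.linalg.pinv(C_train)

# --- inversion: reconstruct channel responses for a held-out trial ---
ori_test = np.array([60.0])                          # true orientation of the test grating
B_test = W_true @ channel_basis(ori_test).T + 0.1 * rng.normal(size=(n_sensors, 1))
C_hat = np.linalg.pinv(W_hat) @ B_test               # reconstructed tuning profile

print(centres[np.argmax(C_hat)])                     # profile peaks at the 60-deg channel
```

In the paradigm described here, the same inversion would be applied to the activity evoked by each item in the RSVP stream, so the gain and width of the reconstructed profile can be compared across seen and missed targets.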
