Abstract

Shifting attention from one color to another, or from color to another feature dimension such as orientation, is imperative when searching for a particular object in a cluttered scene. Most attention models that emphasize feature-based selection implicitly assume that within- and cross-dimensional shifts take equally long. In contrast, the dimensional weighting account (DWA) predicts that cross-dimensional shifts take longer than shifts within a dimension, because attentional weights need to be shifted not only between features but also from one dimension to another. We investigated whether shift costs between feature dimensions reflect a general mechanism beyond singleton feature search by using a non-search task. To this end, we recorded time courses of behavioral data and of steady-state visual evoked potentials (SSVEPs) in human EEG, which provide an objective electrophysiological measure of neural dynamics in early visual cortex. SSVEPs were elicited by four random dot kinematograms (RDKs) that flickered at different frequencies. Each RDK comprised dashes that uniquely combined two features from the dimensions color (red or blue) and orientation (slash or backslash). Subjects were cued either to shift attention within a feature dimension (e.g., from color to color) or cross-dimensionally (e.g., from color to orientation), and to detect and respond to brief coherent motion events of a subset of dots in the to-be-attended dimension while ignoring all other events. For behavioral as well as electrophysiological data, we found that shifts between feature dimensions took longer than shifts within a dimension. Interestingly, shifts toward color took less time than shifts toward orientation, regardless of the initially to-be-attended feature. In conclusion, while the first result is in line with the DWA, the second might reflect, in line with previous studies, that color and orientation are preferentially processed at different stages of the visual cortical hierarchy that receive attentional top-down modulations with different latencies. Meeting abstract presented at VSS 2015.
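The SSVEP time courses described above rest on frequency tagging: each RDK flickers at its own rate, so the allocation of attention to a given stimulus can be tracked as the amplitude of the EEG response at that stimulus's frequency over time. The sketch below is not the authors' analysis pipeline; it is a minimal illustration, with hypothetical tagging frequencies and sampling rate, of how an SSVEP amplitude time course could be extracted by band-pass filtering around each tagging frequency and taking the Hilbert envelope.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Hypothetical parameters: the abstract does not report the actual flicker
# frequencies of the four RDKs or the EEG sampling rate.
fs = 256.0                            # sampling rate in Hz (assumed)
tag_freqs = [8.5, 10.0, 12.0, 15.0]   # one tagging frequency per RDK (assumed)

def ssvep_envelope(eeg, f, bw=1.0):
    """Band-pass one channel around a tagging frequency and return the
    instantaneous amplitude (Hilbert envelope), i.e. an SSVEP time course."""
    b, a = butter(4, [(f - bw) / (fs / 2), (f + bw) / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, eeg)))

# Toy example: one simulated occipital channel containing all four
# tagging frequencies plus noise.
t = np.arange(0, 10, 1 / fs)
eeg = sum(np.sin(2 * np.pi * f * t) for f in tag_freqs) + np.random.randn(t.size)

# One amplitude time course per RDK; attention shifts would appear as
# changes in these envelopes over time.
envelopes = {f: ssvep_envelope(eeg, f) for f in tag_freqs}
```

In practice such envelopes would be averaged over trials and occipital electrodes and time-locked to the shift cue, but the core idea of reading out one narrow-band amplitude per flickering stimulus is the same.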
