Abstract

Real-world visual search targets are frequently imperfect perceptual matches to our internal templates. For example, a friend will wear different clothes, hairstyles, and accessories on different occasions, but some of these attributes vary more than others. The ability to deal with template-to-target variability is important for visual search in natural environments, but we know relatively little about how the attentional system handles it. Here, we test the hypothesis that top-down attentional biases are sensitive to the variance of target features and prioritize less-variable dimensions. Subjects were shown target cues composed of coloured dots moving in a specific direction, followed by either a working memory probe (30% of trials) or a visual search display (70% of trials). Critically, the target features in the visual search display differed from the cue: one feature was drawn from a narrow distribution (low-variance dimension), and the other was sampled from a broader distribution (high-variance dimension). The results demonstrate that subjects used knowledge of the likely cue-to-target variance to set template precision and bias attentional selection. Our results suggest that observers are sensitive to the variance of feature dimensions within a target and use this information to weight mechanisms of attentional selection.
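The key manipulation described above is that the search-display target's features are jittered relative to the cue, with one feature dimension drawn from a narrow distribution and the other from a broad one. The sketch below illustrates that sampling logic only; the Gaussian form, the degree units, and the standard deviations are assumptions for illustration, not the study's actual parameters.

```python
# Minimal sketch of the cue-to-target sampling described in the abstract.
# All parameter values (standard deviations, circular degree units) are
# illustrative assumptions, not the study's actual design.
import numpy as np

rng = np.random.default_rng(0)

def sample_target(cue_color_deg, cue_motion_deg,
                  low_sd=5.0, high_sd=40.0, color_is_stable=True):
    """Return a search-display target whose features are jittered versions
    of the cue: one dimension drawn from a narrow (low-variance) distribution,
    the other from a broad (high-variance) distribution."""
    color_sd, motion_sd = (low_sd, high_sd) if color_is_stable else (high_sd, low_sd)
    target_color = (cue_color_deg + rng.normal(0.0, color_sd)) % 360
    target_motion = (cue_motion_deg + rng.normal(0.0, motion_sd)) % 360
    return target_color, target_motion

# Example trial: colour is the low-variance (more reliable) dimension,
# motion direction is the high-variance dimension.
print(sample_target(cue_color_deg=120.0, cue_motion_deg=45.0, color_is_stable=True))
```

Under this framing, the hypothesis is that observers weight the low-variance dimension more heavily when forming the attentional template, since it predicts the upcoming target more reliably.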
