Abstract

Current models of attention propose that attention can be tuned in a top-down, controlled manner to a specific feature value (e.g., a particular shape or color) to find specific items (e.g., a red car; feature-specific search). However, subsequent research has shown that attention is often tuned in a context-dependent manner to the relative features that distinguish a sought-after target from surrounding nontarget items (e.g., larger, bluer, faster; relational search). It is currently unknown whether search for multiple targets with different attributes will be feature-specific or relational. In the present study, observers searched for two targets that differed either across two stimulus dimensions (color and motion; Experiment 1) or within the same stimulus dimension (color; Experiment 2: orange/redder or aqua/bluer). We distinguished between feature-specific and relational search by measuring eye movements to different types of irrelevant distractors (e.g., relationally matching vs. feature-matching). The results showed that attention was biased to the two relative features of the targets, both across different feature dimensions (i.e., motion and color) and within a single dimension (i.e., two colors: bluer and redder). The results were not due to automatic intertrial effects (dimension weighting or feature priming), and we found only small effects of validly precueing the target feature, indicating that relational search for two targets was conducted with relative ease. This is the first demonstration that attention is biased top-down to the relative target features in dual-target search, showing that the relational account generalizes to multiple-target search.