Abstract

Real-world object size is a behaviorally relevant object property that is automatically retrieved when viewing object images: participants are faster to indicate the bigger of two object images when this object is also bigger in the real world. What drives this size Stroop effect? One possibility is that it reflects the automatic retrieval of real-world size after objects are recognized at the basic level (e.g., recognizing an object as a plane activates large real-world size). An alternative possibility is that the size Stroop effect is driven by automatic associations between low-/mid-level visual features (e.g., rectilinearity) and real-world size, bypassing object recognition. Here, we tested both accounts. In Experiment 1, objects were displayed upright and inverted, slowing down recognition while equating visual features. Inversion strongly reduced the Stroop effect, indicating that object recognition contributed to the Stroop effect. Independently of inversion, however, trial-wise differences in rectilinearity also contributed to the Stroop effect. In Experiment 2, the Stroop effect was compared between manmade objects (for which rectilinearity was associated with size) and animals (no association between rectilinearity and size). The Stroop effect was larger for animals than for manmade objects, indicating that rectilinear feature differences were not necessary for the Stroop effect. Finally, in Experiment 3, unrecognizable "texform" objects that maintained size-related visual feature differences were displayed upright and inverted. Results revealed a small Stroop effect for both upright and inverted conditions. Altogether, these results indicate that the size Stroop effect partly follows object recognition with an additional contribution from visual feature associations. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
