Not only semantic associations, but also recently learned arbitrary associations have the potential to facilitate visual processing in everyday life: for example, knowledge of a (moveable) object's location at a specific time may facilitate visual processing of that object. In our prior work, we showed that previewing a scene can facilitate processing of recently associated objects at the level of visual analysis (Smith & Federmeier, Journal of Cognitive Neuroscience, 32(5), 783-803, 2020). In the current study, we assess how rapidly this facilitation unfolds by manipulating scene preview duration. We then compare our results to studies using well-learned object-scene associations in a first-pass assessment of whether systems consolidation might speed up high-level visual prediction. In two ERP experiments (N = 60), participants studied categorically organized novel object-scene pairs in an explicit paired-associate learning task. At test, we varied contextual pre-exposure duration both between subjects (200 vs. 2500 ms) and within subjects (0-2500 ms). We examined the N300, an event-related potential component linked to high-level visual processing of objects and scenes, and found that N300 effects of scene congruity increase with longer scene previews, up to approximately 1-2 s. Similar results were obtained for response times and in a separate, component-neutral ERP analysis of visual template matching. Our findings contrast with prior evidence that scenes can rapidly facilitate visual processing of commonly associated objects, raising the possibility that systems consolidation mediates different kinds of predictive processing with different temporal profiles.