Abstract
Information about the form and the spatial location of objects is seamlessly integrated during visual perception. We used event-related potentials (ERPs) to explore neural activity related to processing form, location, or the combination of both kinds of features. Healthy subjects performed three versions of a ‘match-to-sample’ task: a two-object task, a two-location task, and an integrated object–location task. Responses were quickest and most accurate during the integrated task, slower and less accurate in the two-location task, and slowest and least accurate in the two-object task. ERPs time-locked to the ‘sample’ stimulus at encoding and to the ‘target’ stimulus during feature comparison differentiated between the tasks. ‘Sample’ stimulus ERPs exhibited task-specific posterior cortical involvement in processing distinct visual features. ‘Target’ stimulus ERPs revealed task-related differences in components associated with frontal lobe–mediated attentional processes: an early-latency P300 showed increased amplitude during the integrated task. Results from this experiment support the view that distinct neural circuits mediate form vs. location processing and that form–location integration engages both pathways and upregulates frontal–parietal association networks.