Predictive updating of an object's spatial coordinates from pre-saccade to post-saccade contributes to stable visual perception. Whether object features are predictively remapped remains contested. We set out to characterise the spatiotemporal dynamics of feature processing during stable fixation and active vision. To do so, we applied multivariate decoding methods to electroencephalography (EEG) data collected while human participants (male and female) viewed brief visual stimuli. Stimuli appeared at different locations across the visual field at either high or low spatial frequency (SF). During fixation, classifiers were trained to decode SF presented at one parafoveal location and cross-tested on SF from the same, an adjacent, or a more peripheral location. When training and testing on the same location, SF was classified shortly after stimulus onset (∼79 ms). Decoding of SF at locations farther from the trained location emerged later (∼144–295 ms), with decoding latency modulated by eccentricity. This analysis provides a detailed time course for the spread of feature information across the visual field. Next, we investigated how active vision impacts the emergence of SF information. When a saccade was imminent, decoding of peripheral SF at parafoveal locations emerged earlier, indicating predictive anticipation of SF due to the saccade. Crucially, however, this predictive effect was not limited to the specific remapped location. Rather, peripheral SF was correctly classified, on an accelerated time course, at all parafoveal positions. This indicates spatially coarse, predictive anticipation of stimulus features during active vision, likely enabling a smooth transition on saccade landing.

Significance Statement
Maintaining a continuous representation of object features across saccades is vital for stable vision.
To characterise the spatiotemporal dynamics of stimulus feature representation in the brain, we presented stimuli at high and low spatial frequencies at multiple locations across the visual field. Applying EEG decoding methods, we tracked the neural representation of spatial frequency during both stable fixation and active vision. Using this approach, we provide a detailed time course for the spread of feature information across the visual field during fixation. In addition, when a saccade is imminent, we show that peripheral spatial frequency is predictively represented in anticipation of the post-saccadic input.
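The cross-location decoding logic described above (train a classifier on SF at one location, test it at another, and measure when above-chance decoding emerges) can be sketched on synthetic data. This is a minimal illustration only, not the authors' pipeline: the trial counts, channel count, signal onsets, and the shared multivariate pattern are all invented assumptions, and a simple per-timepoint logistic regression stands in for whatever classifier the study used.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 32, 50

# Hypothetical shared multivariate SF pattern across channels: cross-location
# decoding is only possible if the two locations evoke a common pattern.
pattern = rng.normal(size=n_channels)

def simulate(signal_onset):
    """Synthetic EEG epochs: noise everywhere, plus the shared SF pattern
    added to high-SF trials from `signal_onset` onward (assumed timing)."""
    X = rng.normal(size=(n_trials, n_channels, n_times))
    y = rng.integers(0, 2, n_trials)        # 0 = low SF, 1 = high SF
    X[y == 1, :, signal_onset:] += pattern[None, :, None]
    return X, y

# Trained (parafoveal) location carries SF information early;
# the test (peripheral) location carries it later.
X_train, y_train = simulate(signal_onset=10)
X_test, y_test = simulate(signal_onset=25)

# Train and cross-test a classifier independently at each timepoint.
acc = np.zeros(n_times)
for t in range(n_times):
    clf = make_pipeline(StandardScaler(), LogisticRegression())
    clf.fit(X_train[:, :, t], y_train)
    acc[t] = clf.score(X_test[:, :, t], y_test)

# Decoding latency: first timepoint where cross-location accuracy
# clearly exceeds chance (0.5); here it tracks the test-location onset.
latency = int(np.argmax(acc > 0.75))
```

Under these assumptions, accuracy hovers near chance before the test location's signal onset and rises sharply afterwards, mirroring how decoding latency was used to time the spread of feature information across locations.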