Abstract
Visual processing varies dramatically across the visual field. These differences start in the retina and continue all the way to the visual cortex. Despite these differences in processing, the perceptual experience of humans is remarkably stable and continuous across the visual field. Research in the last decade has shown that processing in peripheral and foveal vision is not independent, but is more directly connected than previously thought. We address three core questions on how peripheral and foveal vision interact, and review recent findings on potentially related phenomena that could provide answers to these questions. First, how is the processing of peripheral and foveal signals related during fixation? Peripheral signals seem to be processed in foveal retinotopic areas to facilitate peripheral object recognition, and foveal information seems to be extrapolated toward the periphery to generate a homogeneous representation of the environment. Second, how are peripheral and foveal signals re-calibrated? Transsaccadic changes in object features lead to a reduction in the discrepancy between peripheral and foveal appearance. Third, how is peripheral and foveal information stitched together across saccades? Peripheral and foveal signals are integrated across saccadic eye movements to average percepts and to reduce uncertainty. Together, these findings illustrate that peripheral and foveal processing are closely connected, mastering the compromise between a large peripheral visual field and high resolution at the fovea.
Highlights
Visual processing varies dramatically across the visual field
How is the processing of peripheral and foveal signals related during fixation? Peripheral signals seem to be processed in foveal retinotopic areas to facilitate peripheral object recognition, and foveal information seems to be extrapolated toward the periphery to generate a homogeneous representation of the environment
Besides the mechanisms of inflation and extrapolation that we described in the previous section, this could be due to the fact that foveal and peripheral vision are, to a certain extent, calibrated
Summary
Although the human eye is often compared to a photographic camera, processing across the visual field is not homogeneous like in a camera film or a digital sensor. Peripheral vision is subject to larger uncertainty in the localization of features and objects (Rentschler & Treutwein, 1985; Levi & Klein, 1986). This is impressively illustrated by models that produce spatially distorted images that cannot be distinguished from their original, undistorted counterparts when viewed in the periphery (Balas, Nakano, & Rosenholtz, 2009; Freeman & Simoncelli, 2011; Koenderink, Valsecchi, van Doorn, Wagemans, & Gegenfurtner, 2017). Beyond these low-level effects, the peripheral visual field is more heavily affected by crowding (Korte, 1923; Bouma, 1970; for reviews see Levi, 2008; Pelli, 2008; Pelli & Tillman, 2008; Whitney & Levi, 2011; Strasburger, 2020). The interactions between peripheral and foveal vision reviewed here might be geared to enhance peripheral vision, thereby making peripheral and foveal vision more homogeneous.
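As a rough quantitative handle on the crowding effect cited above (Bouma, 1970), the critical target-flanker spacing below which crowding sets in is commonly approximated as about half the target's eccentricity. The following minimal Python sketch illustrates that rule of thumb; the function names and the fixed factor of 0.5 are illustrative assumptions, not part of the models reviewed here.

```python
import numpy as np

# Toy illustration of Bouma's rule of thumb for crowding (Bouma, 1970):
# flankers interfere with target identification when they fall within a
# "critical spacing" of roughly half the target's eccentricity.
# The 0.5 factor is the commonly cited approximation, not a fixed constant.

def critical_spacing_deg(eccentricity_deg, bouma_factor=0.5):
    """Approximate critical spacing (deg) below which crowding is expected."""
    return bouma_factor * np.asarray(eccentricity_deg, dtype=float)

def is_crowded(spacing_deg, eccentricity_deg, bouma_factor=0.5):
    """True where the target-flanker spacing falls inside the critical spacing."""
    return spacing_deg < critical_spacing_deg(eccentricity_deg, bouma_factor)

if __name__ == "__main__":
    ecc = np.array([2.0, 5.0, 10.0])      # target eccentricities in degrees
    spacing = np.array([1.5, 1.5, 1.5])   # fixed 1.5 deg target-flanker spacing
    print(critical_spacing_deg(ecc))      # [1.  2.5 5. ]
    print(is_crowded(spacing, ecc))       # [False  True  True]
```

Under these assumptions, a fixed 1.5 deg spacing escapes crowding at 2 deg eccentricity but not at 5 or 10 deg, which is one way to see why crowding dominates peripheral rather than foveal vision.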