Abstract

Adaptation aftereffects are generally stronger for peripheral than for foveal viewing. We examined whether there are also differences in the dynamics of visual adaptation in central and peripheral vision. We tracked the time course of contrast adaptation to binocularly presented Gabor patterns both in the central visual field (within 5°) and in the periphery (beyond 10° eccentricity), using a yes/no detection task to monitor contrast thresholds. Consistent with previous studies, sensitivity losses were stronger in the periphery than in the center when adapting to equivalent high-contrast (90% contrast) patterns. The time course of the threshold changes was fitted with separate exponential functions to estimate the time constants during the adapt and post-adapt phases. When adapting to equivalent high contrast, adaptation effects built up and decayed more slowly in the periphery compared with central adaptation. Surprisingly, the aftereffect in the periphery did not decay completely to baseline within the monitored post-adapt period (400 s), and instead asymptoted to a higher level than for central adaptation. Even when contrast was reduced to one-third (30% contrast) of the central contrast, peripheral adaptation remained stronger and decayed more slowly. This slower dynamic was also confirmed at suprathreshold test contrasts by tracking tilt aftereffects with a 2AFC orientation discrimination task. Our results indicate that the dynamics of contrast adaptation differ between central and peripheral vision, with the periphery adapting not only more strongly but also more slowly, and provide another example of potential qualitative processing differences between central and peripheral vision.
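The exponential fitting step mentioned above can be illustrated with a minimal sketch. This is not the authors' analysis code; the function form, parameter names (amplitude `a`, time constant `tau`, baseline `b`), and the synthetic data are our own assumptions, shown only to make the time-constant estimation concrete.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical exponential-decay model for post-adapt thresholds.
# a = initial threshold elevation, tau = time constant (s), b = asymptote.
def exp_decay(t, a, tau, b):
    return a * np.exp(-t / tau) + b

# Synthetic thresholds over a 400 s post-adapt period (NOT real data):
# generated with tau = 60 s plus small Gaussian noise.
t = np.linspace(0, 400, 81)
rng = np.random.default_rng(0)
y = exp_decay(t, a=0.4, tau=60.0, b=0.05) + rng.normal(0, 0.01, t.size)

# Fit the model to recover the time constant.
popt, _ = curve_fit(exp_decay, t, y, p0=[0.3, 30.0, 0.0])
a_hat, tau_hat, b_hat = popt
```

A nonzero fitted baseline `b` would correspond to the incomplete decay reported for peripheral adaptation; separate fits to the adapt and post-adapt phases yield the build-up and decay time constants compared in the study.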

