Abstract

To extract the global structure of an image, the visual system must integrate local orientation estimates across space. Progress is being made toward understanding this integration process, but very little is known about whether the presence of structure exerts a reciprocal influence on local orientation coding. We have previously shown that adaptation to patterns containing circular or radial structure induces tilt-aftereffects (TAEs), even in locations where the adapting pattern was occluded. These spatially “remote” TAEs have novel tuning properties and behave in a manner consistent with adaptation to the local orientation implied by the circular structure (but not physically present) at a given test location. Here, by manipulating the spatial distribution of local elements in noisy circular textures, we demonstrate that remote TAEs are driven by the extrapolation of orientation structure over remarkably large regions of visual space (more than 20°). We further show that these effects are not specific to adapting stimuli with polar orientation structure, but require a gradient of orientation change across space. Our results suggest that mechanisms of visual adaptation exploit orientation gradients to predict the local pattern content of unfilled regions of space.

Highlights

  • Analysis of orientation structure is fundamental to many aspects of visual perception, including the ability to parse the retinal image into distinct regions and identify the form of different objects

  • Neither observer displayed a remote TAE when the width of each iso-oriented band exceeded ∼12° of visual angle. This spatial band width corresponds to one third of the period of the underlying orientation gradient, meaning that each cycle of orientation change is signaled by three discrete orientations separated by 60° (see the arithmetic sketch after this list)

  • In the present study we investigated the effect of adapting to images containing spatially extensive orientation structure on the perceived orientation of small test stimuli
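A brief arithmetic sketch of the relationship quoted in the second highlight (the ∼36° spatial period is inferred here from the quoted figures and is not a value stated elsewhere on this page):

  \[
  \text{spatial period} \approx 3 \times 12^\circ = 36^\circ \quad \text{(visual angle)}
  \]
  \[
  \Delta\theta_{\text{band}} = \frac{180^\circ}{3} = 60^\circ \quad \text{(orientation repeats every } 180^\circ\text{)}
  \]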



Introduction

Analysis of orientation structure is fundamental to many aspects of visual perception, including the ability to parse the retinal image into distinct regions and identify the form of different objects. To achieve these goals, the visual system must first encode local orientation signals at different points in the visual field before integrating this information across space. Interaction between neighboring neurons provides a potential means to begin extracting orientation structure beyond the spatial constraints of an individual receptive field. It is likely that the analysis of more complex and spatially extensive orientation structure relies upon the progressive convergence of V1 outputs in extra-striate visual areas.

