Abstract

Many techniques have been developed to visualize how an image would appear to an individual with a different visual sensitivity: e.g., because of optical or age differences, or a color deficiency or disease. This protocol describes a technique for incorporating sensory adaptation into these simulations. The protocol is illustrated with the example of color vision, but is generally applicable to any form of visual adaptation. The protocol uses a simple model of human color vision based on standard and plausible assumptions about the retinal and cortical mechanisms encoding color and how these adjust their sensitivity to both the average color and the range of color in the prevailing stimulus. The gains of the mechanisms are adapted so that their mean response under one context is equated with their mean response under a different context. The simulations help reveal the theoretical limits of adaptation and generate "adapted images" that are optimally matched to a specific environment or observer. They also provide a common metric for exploring the effects of adaptation within different observers or different environments. Characterizing visual perception and performance with these images provides a novel tool for studying the functions and consequences of long-term adaptation in vision or other sensory systems.
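As a rough illustration of the gain adjustment described above, the sketch below (in Python, with hypothetical names; not the published code) rescales cone responses so that their mean, and then their spread, under an adapting environment match the corresponding statistics of a reference context. The actual protocol applies these adjustments to retinal and cortical color mechanisms rather than directly to cone excitations, so this is only a minimal approximation under those simplifying assumptions.

```python
import numpy as np

def adapt_image(image_lms, env_lms, ref_lms):
    """
    Minimal sketch of the two-stage adjustment outlined in the abstract
    (hypothetical helper, not the authors' implementation):
      1. multiplicative gains equate the mean response of each channel
         in the adapting environment to its mean in the reference
         context (adaptation to the average color);
      2. deviations from that mean are rescaled so the response range
         is also matched (adaptation to the range of colors).

    image_lms : H x W x 3 cone excitations of the image to be rendered
    env_lms   : N x 3 cone excitations sampled from the adapting environment
    ref_lms   : N x 3 cone excitations sampled from the reference context
    """
    env_mean = env_lms.mean(axis=0)
    ref_mean = ref_lms.mean(axis=0)

    # Stage 1: von Kries-style gains equate the mean response per channel
    gains = ref_mean / env_mean
    scaled = image_lms * gains

    # Statistics of the adapting environment after the gain adjustment
    env_scaled_std = (env_lms * gains).std(axis=0)
    ref_std = ref_lms.std(axis=0)

    # Stage 2: rescale deviations from the (now common) mean so the
    # spread of responses is also matched across the two contexts
    adapted = ref_mean + (scaled - ref_mean) * (ref_std / env_scaled_std)
    return adapted
```

In use, env_lms and ref_lms would be cone excitations sampled from images of the two environments (or from the color statistics of two observers' worlds), and the returned array is the "adapted image" rendered for the new context.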
