Chromatic induction and retinal image motion.

Abstract

As the eyes drift across a scene, borders between surfaces slide across the retina. Consequently, near those borders, parts of the retina that have adapted to the light on one side of the border are exposed to the light on the other side. Such changes in exposure might increase the judged contrast. Retinal image motion might therefore contribute to chromatic induction, the influence that adjacent colours have on a surface's apparent colour, by increasing the apparent colour contrast. We conducted two experiments to evaluate this possibility, examining how artificially increasing or decreasing the extent to which selected surface borders shift across the retina influences the perceived colour. Neither manipulation had a substantial influence on the perceived colour. This implies that chromatic induction does not arise from overestimating the contrast between adjacent surfaces when small eye movements shift the border between those surfaces across the retina.

Similar Papers
  • Research Article
  • Cited by 18
  • DOI: 10.1113/jphysiol.2013.258640
Velocity storage mechanism in zebrafish larvae
  • Dec 5, 2013
  • The Journal of Physiology
  • Chien‐Cheng Chen + 7 more

The optokinetic reflex (OKR) and the angular vestibulo-ocular reflex (aVOR) complement each other to stabilize images on the retina despite self- or world motion, a joint mechanism that is critical for effective vision. It is currently hypothesized that signals from both systems integrate, in a mathematical sense, in a network of neurons operating as a velocity storage mechanism (VSM). When exposed to a rotating visual surround, subjects display the OKR, slow following eye movements frequently interrupted by fast resetting eye movements. Subsequent to light-off during optokinetic stimulation, eye movements do not stop abruptly, but decay slowly, a phenomenon referred to as the optokinetic after-response (OKAR). The OKAR is most likely generated by the VSM. In this study, we observed the OKAR in developing larval zebrafish before the horizontal aVOR emerged. Our results suggest that the VSM develops prior to and without the need for a functional aVOR. It may be critical to ocular motor control in early development as it increases the efficiency of the OKR.

  • Research Article
  • Cited by 1
  • DOI: 10.1097/opx.0000000000001643
High- and Low-contrast Letter Acuity during Image Motion in Normal Observers and Observers with Infantile Nystagmus Syndrome.
  • Feb 1, 2021
  • Optometry and Vision Science
  • Harold E Bedell + 1 more

High-contrast acuity in individuals with infantile nystagmus syndrome (INS) is poorer than expected from their ongoing retinal image motion, indicating a sensory loss. Conversely, acuity for larger low-contrast letters in these observers may be limited by image motion alone. The aim of this study was to assess visual acuity for letters of different contrast in normal observers and individuals with idiopathic INS under conditions of comparable retinal image motion. Visual acuity was measured using projected Landolt C charts in 3 normal observers and 11 observers with presumed idiopathic INS. Normal observers viewed each chart after reflection from a front-surface mirror that underwent continuous 4-Hz ramp motion with amplitudes ranging from 4 to 9.6° and simulated foveation durations of 20 to 80 milliseconds. Observers with INS viewed the charts directly. By reciprocally varying the luminance of the projected charts and a superimposed veiling source, Landolt C's were presented on a background luminance of 43 cd/m2 with Weber contrasts between -12 and -89%. Whereas normal observers' high-contrast acuity during imposed image motion depends only on the duration of the simulated foveation periods, acuity for low-contrast optotypes also worsens systematically as motion intensity (frequency × amplitude) increases. For comparable parameters of retinal image motion, high-contrast acuity in all but one of the observers with INS was poorer than in normal observers. On the other hand, low-contrast acuity in the two groups of observers was similar when the retinal image motion was comparable. Reduced high-contrast acuity in observers with INS appears to be attributable primarily to a sensory deficit. On the other hand, the reduction of low-contrast acuity in observers with INS may be accounted for on the basis of retinal image motion.
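The two quantities this study manipulates, Weber contrast and motion intensity (frequency × amplitude), are simple formulas; the sketch below makes them concrete. The function names and the sample target luminance are illustrative assumptions, not values taken from the paper.

```python
def weber_contrast(l_target, l_background):
    """Weber contrast: C = (L_target - L_background) / L_background.
    Negative for targets darker than the background."""
    return (l_target - l_background) / l_background

def motion_intensity(freq_hz, amplitude_deg):
    """Motion intensity as defined in the abstract: frequency x amplitude."""
    return freq_hz * amplitude_deg

# A dark Landolt C on the 43 cd/m^2 background used in the study;
# a target luminance of ~4.7 cd/m^2 (hypothetical) gives roughly -89% contrast.
c = weber_contrast(4.7, 43.0)

# The study's strongest imposed motion: 4-Hz ramp motion, 9.6 deg amplitude.
m = motion_intensity(4.0, 9.6)
```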

  • Research Article
  • Cited by 39
  • DOI: 10.1167/iovs.09-4334
Investigating Unstable Fixation in Patients with Macular Disease
  • Mar 9, 2011
  • Investigative Ophthalmology & Visual Science
  • Antonio F Macedo + 2 more

To assess the effect on visual acuity of compensating fixation instability by controlling retinal image motion in people with macular disease. Ten patients with macular disease participated in this study. Crowded and noncrowded visual acuity were measured using an eye tracking system to compensate for fixation instability. Four conditions, corresponding to four levels of retinal image motion, were tested: no compensation (normal motion), partial compensation (reduced motion), total compensation (no motion), and overcompensation (increased motion). Fixation stability and the number of preferred retinal loci were also measured. Modulating retinal image motion had the same effect on crowded and noncrowded visual acuity (P = 0.601). When fixation instability was overcompensated, acuity worsened by 0.1 logMAR units (P < 0.001) compared with baseline (no compensation) and remained equal to baseline for all other conditions. In people with macular disease, retinal image motion caused by fixation instability does not reduce either crowded or noncrowded visual acuity. Acuity declines when fixation instability is overcompensated, showing limited tolerance to increased retinal image motion. The results provide evidence that fixation instability does not improve visual acuity and may be a consequence of poor oculomotor control.

  • Conference Article
  • Cited by 1
  • DOI: 10.1364/vsia.1995.sub3
Dynamic Visual Acuity During Vertical Retinal Image Motion: Comparison of Normal and Low Vision
  • Jan 1, 1995
  • Joseph L Demer

Telescopic spectacles are potentially advantageous as aids for visually impaired patients, who might be expected to uniformly benefit from optical magnification. However, these relatively costly devices fail to benefit a substantial number of patients with low vision [1]. Based upon retrospective [1] and prospective clinical data [2], it has been hypothesized that a major cause of rehabilitation failure with telescopic spectacles is retinal image instability during involuntary head motion. This retinal slip hypothesis supposes that ubiquitous head motion, optically magnified by telescopic spectacles, overwhelms compensatory ocular motor reflexes to produce slipping motion of images on the retina, which degrades dynamic visual acuity (DVA) sufficiently to negate the magnification advantage of the telescopes. Several lines of evidence support this hypothesis. The visual-vestibulo-ocular reflex (VVOR) is the principal compensatory mechanism stabilizing the retina during head movements. The VVOR gain of both normally sighted [3] and low vision [1] subjects fails to increase sufficiently to match the magnification of telescopic spectacles, a situation presumed to result in retinal image motion during head motion. In normally sighted subjects, actual retinal image motion has been measured during DVA tasks conducted both with vertically moving optotypes, and during vertically imposed head motion [4]. That study found that, regardless of whether retinal image motion was produced by moving optotypes or moving head, DVA in normal subjects was independent of image velocity up to 2°/sec, but declined with the 0.6 power of image velocity for greater slip. The present investigation was conducted to determine if patients with low vision have a similar relationship between DVA and retinal image slip velocity.
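The velocity dependence cited above, DVA unaffected by retinal slip up to 2°/sec and then declining with the 0.6 power of image velocity, amounts to a piecewise power law. The sketch below is illustrative only: the baseline value and units are placeholders, not measured data from the study.

```python
def acuity_threshold(slip_deg_per_s, baseline=1.0, knee=2.0, exponent=0.6):
    """Illustrative model of the cited relationship: the minimum
    resolvable detail (arbitrary units; larger = worse acuity) is
    constant below the knee velocity and grows as
    (v / knee) ** exponent above it.  Baseline is a placeholder."""
    if slip_deg_per_s <= knee:
        return baseline
    return baseline * (slip_deg_per_s / knee) ** exponent
```

Under this model, an eightfold increase in slip velocity beyond the knee (2 to 16°/sec) worsens the threshold by a factor of 8 ** 0.6, roughly 3.5.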

  • Research Article
  • Cited by 14
  • DOI: 10.1152/jn.1999.82.2.551
Proprioceptive and retinal afference modify postsaccadic ocular drift.
  • Aug 1, 1999
  • Journal of Neurophysiology
  • Richard F Lewis + 3 more

Drift of the eyes after saccades produces motion of images on the retina (retinal slip) that degrades visual acuity. In this study, we examined the contributions of proprioceptive and retinal afference to the suppression of postsaccadic drift induced by a unilateral ocular muscle paresis. Eye movements were recorded in three rhesus monkeys with a unilateral weakness of one vertical extraocular muscle before and after proprioceptive deafferentation of the paretic eye. Postsaccadic drift was examined in four visual states: monocular viewing with the normal eye (4-wk period); binocular viewing (2-wk period); binocular viewing with a disparity-reducing prism (2-wk period); and monocular viewing with the paretic eye (2-wk period). The muscle paresis produced vertical postsaccadic drift in the paretic eye, and this drift was suppressed in the binocular viewing condition even when the animals could not fuse. When the animals viewed binocularly with a disparity-reducing prism, the drift in the paretic eye was suppressed in two monkeys (with superior oblique pareses) but generally was enhanced in one animal (with a tenotomy of the inferior rectus). When drift movements were enhanced, they reduced the retinal disparity that was present at the end of the saccade. In the paretic-eye-viewing condition, postsaccadic drift was suppressed in the paretic eye and was induced in the normal eye. After deafferentation in the normal-eye-viewing state, there was a change in the vertical postsaccadic drift of the paretic eye. This change in drift was idiosyncratic and variably affected the amplitude and velocity of the postsaccadic drift movements of the paretic eye. Deafferentation of the paretic eye did not affect the postsaccadic drift of the normal eye nor did it impair visually mediated adaptation of postsaccadic drift. 
The results demonstrate several new findings concerning the roles of visual and proprioceptive afference in the control of postsaccadic drift: disconjugate adaptation of postsaccadic drift does not require binocular fusion; slow, postsaccadic drift movements that reduce retinal disparity but concurrently increase retinal slip can be induced in the binocular viewing state; postsaccadic drift is modified by proprioception from the extraocular muscles, but these modifications do not serve to minimize retinal slip or to correct errors in saccade amplitude; and visually mediated adaptation of postsaccadic drift does not require proprioceptive afference from the paretic eye.

  • Research Article
  • Cited by 2
  • DOI: 10.1167/19.4.2
The influence of retinal image motion on the perceptual grouping of temporally asynchronous stimuli
  • Apr 3, 2019
  • Journal of Vision
  • Adela S Y Park + 3 more

Briefly presented stimuli can reveal the lower limit of retinal-based perceptual stabilization mechanisms. This is demonstrated in perceptual grouping of temporally asynchronous stimuli, in which alternate row or column elements of a regular grid are presented over two successive display frames with an imperceptible temporal offset. The grouping phenomenon results from a subtle shift between alternate grid elements due to incomplete compensation of small, fixational eye movements occurring between the two presentation frames. This suggests that larger retinal shifts should amplify the introduced shifts between alternate grid elements and improve grouping performance. However, large shifts are necessarily absent in small eye movements. Furthermore, shifts follow a random walk, making the relationship between shift magnitude and performance difficult to explore systematically. Here, we established a systematic relationship between retinal image motion and perceptual grouping by presenting alternate grid elements (untracked) during smooth pursuit of known velocities. Our results show grouping performance to improve in direct proportion to pursuit velocity. Any potential compensation by extraretinal signals (e.g., efference copy) does not seem to occur.

  • Research Article
  • Cited by 366
  • DOI: 10.1038/nature05866
Miniature eye movements enhance fine spatial detail
  • Jun 1, 2007
  • Nature
  • Michele Rucci + 3 more

Our eyes are constantly in motion. Even during visual fixation, small eye movements continually jitter the location of gaze. It is known that visual percepts tend to fade when retinal image motion is eliminated in the laboratory. However, it has long been debated whether, during natural viewing, fixational eye movements have functions in addition to preventing the visual scene from fading. In this study, we analysed the influence in humans of fixational eye movements on the discrimination of gratings masked by noise that has a power spectrum similar to that of natural images. Using a new method of retinal image stabilization, we selectively eliminated the motion of the retinal image that normally occurs during the intersaccadic intervals of visual fixation. Here we show that fixational eye movements improve discrimination of high spatial frequency stimuli, but not of low spatial frequency stimuli. This improvement originates from the temporal modulations introduced by fixational eye movements in the visual input to the retina, which emphasize the high spatial frequency harmonics of the stimulus. In a natural visual world dominated by low spatial frequencies, fixational eye movements appear to constitute an effective sampling strategy by which the visual system enhances the processing of spatial detail.

  • Research Article
  • Cited by 1
  • DOI: 10.3389/frsip.2023.1133210
Apparent color picker: color prediction model to extract apparent color in photos
  • May 9, 2023
  • Frontiers in Signal Processing
  • Yuki Kubota + 2 more

A color extraction interface that reflects human color perception helps users pick colors from natural images as they appear. Apparent color in photos differs from pixel color due to complex factors, including color constancy and adjacent color. However, methodologies for estimating the apparent color in photos have yet to be proposed. In this paper, the authors investigate suitable model structures and features for constructing an apparent color picker, which extracts the apparent color from natural photos. Regression models were constructed based on the psychophysical dataset for given images to predict the apparent color from image features. The linear regression model incorporates features that reflect multi-scale adjacent colors. The evaluation experiments confirm that the estimated color was closer to the apparent color than the pixel color for an average of 70%–80% of the images. However, the accuracy decreased for several conditions, including low and high saturation at low luminance. The authors believe that the proposed methodology could be applied to develop user interfaces to compensate for the discrepancy between human perception and computer predictions.
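The feature idea described in the abstract, pixel colour plus multi-scale adjacent-colour statistics feeding a linear model, might be sketched roughly as follows. The function names, square windows, and scale radii are assumptions for illustration, not the authors' implementation.

```python
def mean_color(image, x, y, radius):
    """Mean RGB in a (2*radius+1)^2 window centred on (x, y),
    clipped at the image border.  `image` is rows of RGB triples."""
    h, w = len(image), len(image[0])
    total, n = [0.0, 0.0, 0.0], 0
    for j in range(max(0, y - radius), min(h, y + radius + 1)):
        for i in range(max(0, x - radius), min(w, x + radius + 1)):
            for c in range(3):
                total[c] += image[j][i][c]
            n += 1
    return [t / n for t in total]

def features(image, x, y, scales=(1, 4, 16)):
    """Feature vector for one pixel: its own colour followed by
    the mean adjacent colour at several neighbourhood scales.
    These features would then feed a linear regression model."""
    feats = list(image[y][x])
    for r in scales:
        feats += mean_color(image, x, y, r)
    return feats
```

On a uniform image, every multi-scale mean equals the pixel colour, so the apparent-colour prediction reduces to a function of the pixel colour alone; the adjacent-colour terms only matter when the surround differs from the target.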

  • Research Article
  • Cited by 141
  • DOI: 10.1007/bf00230534
A neuronal correlate of spatial stability during periods of self-induced visual motion
  • Sep 1, 1991
  • Experimental Brain Research
  • R.G Erickson + 1 more

Motion of background visual images across the retina during slow tracking eye movements is usually not consciously perceived so long as the retinal image motion results entirely from the voluntary slow eye movement (otherwise the surround would appear to move during pursuit eye movements). To address the question of where in the brain such filtering might occur, the responses of cells in 3 visuo-cortical areas of macaque monkeys were compared when retinal image motion of background images was caused by object motion as opposed to a pursuit eye movement. While almost all cells in areas V4 and MT responded indiscriminately to retinal image motion arising from any source, most of those recorded in the dorsal zone of area MST (MSTd), as well as a smaller proportion in lateral MST (MSTl), responded preferentially to externally-induced motion and only weakly or not at all to self-induced visual motion. Such cells preserve visuo-spatial stability during low-velocity voluntary eye movements and could contribute to the process of providing consistent spatial orientation regardless of whether the eyes are moving or stationary.

  • Research Article
  • Cited by 185
  • DOI: 10.1152/jn.1988.59.1.19
The role of the posterior vermis of monkey cerebellum in smooth-pursuit eye movement control. II. Target velocity-related Purkinje cell activity.
  • Jan 1, 1988
  • Journal of Neurophysiology
  • D A Suzuki + 1 more

1. Purkinje cell activity was recorded from lobules VI and VII of the cerebellar vermis during the performance of visuooculomotor tasks designed to dissociate the signals related to head, smooth-pursuit eye, and retinal image movements. Task-related modulations in the simple spike discharge rates of 157 cells were observed in three alert monkeys. 2. Of 65 Purkinje cells that were completely tested for all three signals, all exhibited smooth-pursuit eye movement-related activity. An additional vestibular or visual response was observed in 17 and 11% of the cells, respectively. Eye, head, and retinal image velocity signals were all recorded in the same unit in 52% of the Purkinje cells. The responses of 5% of the fully tested cells were associated with changes in the direction of eye, head, and retinal image movement. 3. The observed sensorioculomotor responses were direction selective in 98% of the Purkinje cells. For the Purkinje cells that were fully tested, 60% of the cells exhibited peak discharge rates for ipsilateral and 40% for contralateral eye velocity. Of these Purkinje cells, 45% exhibited eye, head, and retinal image velocity signals with equivalent direction preferences. 4. Of 42 Purkinje cells tested, 88% demonstrated some kinds of interactive responses during combined eye and sensory stimulation. The interaction of eye and head velocity signals has been discussed in a companion paper (38). The modulation in discharge rate observed during tracking in the presence of a random dot background pattern could be predicted from the dissociated responses to smooth pursuit in the dark and to movements of the background pattern during suppression of eye movements. 5. The sensitivity to smooth-pursuit eye velocity averaged 1.4 times the sensitivity to head velocity. In 80% of the Purkinje cells, however, the sensitivity to eye velocity exceeded the sensitivity to head velocity by an average of only 10%. 
The sensitivity to smooth-pursuit eye velocity averaged 1.6 times the sensitivity to retinal image velocity. 6. An increase in Purkinje cell discharge rate was observed during the open-loop period of the initiation of smooth-pursuit eye movements. This open-loop response was consistent with the presence of a visual signal during ocular pursuit, since these cells were also shown to be responsive to a dissociated retinal image velocity signal. Furthermore, the magnitude of the open-loop response indicated an enhancement of the sensitivity to retinal image velocity when visual information became behaviorally significant.(ABSTRACT TRUNCATED AT 400 WORDS)

  • Book Chapter
  • Cited by 15
  • DOI: 10.1007/978-1-4757-9628-5_10
Motion Processing in Monkey Striate Cortex
  • Jan 1, 1994
  • Guy A Orban

Motion processing, i.e., the processing of retinal image movement, is of great importance for primates (for review, see, e.g., Nakayama, 1985). In fact, motion processing could be considered fundamental to vision since retinal images are always moving as a result of micro eye movements, essential for visual perception. However, retinal image motion, whether generated by micro or macro eye movements, including pursuit and saccades, contains no information about the outside world. This is not the case for retinal image motion generated by the subject’s own movements. The spatiotemporal changes in the retinal light distribution induced by relative movement between the observer and the environment, generated either by object motion or by self motion, are referred to as optic flow. Optic flow is a rich source of information about the outside world. It provides information about the 3-D trajectory of moving objects and of the moving subject, as well as about the 3-D structure of the environment. Furthermore, motion is a clear signal for image segmentation and perceptual grouping. In addition to its many perceptual uses, retinal motion also contributes to the control of eye movements, saccades as well as pursuit and optokinetic nystagmus. The term motion processing generally refers to the analysis of retinal image motion inasmuch as this leads to control of eye position and to extraction of information about the outside world.

  • Research Article
  • Cited by 8
  • DOI: 10.1167/5.8.590
Miniature eye movements measured simultaneously with ophthalmic imaging and a dual-Purkinje image eye tracker
  • Mar 17, 2010
  • Journal of Vision
  • S B Stevenson + 1 more

Background: Scanning laser ophthalmoscopes with adaptive optics correction of ocular aberrations provide retinal images of unprecedented resolution, allowing for real-time imaging of photoreceptors. Eye movements made by the subject/patient during recording produce distortions that must be corrected before multiple frames can be added together to achieve noise reduction or to build a mosaic image from different retinal areas. These distortions also provide a high spatial and temporal resolution record of the miniature eye movements made during fixation. Here we report simultaneous measurements of fixation eye movements with an Adaptive Optics Scanning Laser Ophthalmoscope (AOSLO) and a dual-Purkinje image (dPi) eye tracker in order to cross-validate these two methods of recording miniature eye movement. Method: Foveas of three subjects were imaged with a one degree square scan using the Houston AOSLO, at a resolution of 8 pixels per arc minute. A Generation V dPi tracker from SRI was placed in front of the AOSLO, and eye movements were recorded at the same time from the same eye being imaged by the AOSLO. AOSLO movies were analyzed off line to extract retinal image motion. The resulting traces were then overlaid on the dPi recordings for comparison. Results: The two methods produced records that agreed to within about one arc minute, with more significant disagreements occurring after eye blinks. Microsaccades in the dPi record were accompanied by overshoots that have previously been associated with lens wobble. AOSLO traces also showed saccade-related overshoots, but of much smaller amplitude. Conclusions: Eye movement recordings measured with dual Purkinje image trackers predict retinal image motion to a precision of about 1 arc minute, except for 10–20 milliseconds following each saccade and 500 – 1000 milliseconds following each eye blink. Retinal image motion measured directly from AOSLO recordings can be recovered to a precision of just a few arc seconds.

  • Research Article
  • Cited by 45
  • DOI: 10.1097/00006324-200011000-00006
Perception of a clear and stable visual world with congenital nystagmus.
  • Nov 1, 2000
  • Optometry and Vision Science
  • Harold E Bedell

Comparisons between the visual performance of persons with congenital nystagmus (CN) and normal observers under conditions of similar retinal image motion reveal the extent to which the nystagmus-induced image motion determines visual functioning and perception in CN. Visual acuity undergoes similar changes with the characteristics of retinal image motion in normal observers and persons with CN. However, acuity is poorer than expected on the basis of the image motion in some individuals with CN, suggesting an additional sensory deficit. When presented with visual stimuli that simulate the retinal image motion in CN, normal observers perceive substantial target movement and motion smear. In contrast, most individuals with CN perceive the visual world to be stable and relatively clear. These dramatic perceptual differences are attributed primarily to the visual consequences of extra-retinal signals, which have been shown to accompany the involuntary eye movements in CN as well as the voluntary and involuntary eye movements in normal observers. Adaptation to periodic motion of the retinal image may also contribute to the perception of stability in persons with CN. The data presented in this paper indicate that, on the whole, largely similar visual mechanisms are likely to underlie visual functioning and mediate perception in persons with CN and normal vision.

  • Research Article
  • Cited by 45
  • DOI: 10.1016/s0376-6357(98)00007-2
Side-to-side head movements to obtain motion depth cues: A short review of research on the praying mantis
  • Apr 1, 1998
  • Behavioural Processes
  • Karl Kral

  • Research Article
  • Cited by 12
  • DOI: 10.1523/jneurosci.3166-11.2011
Representation of Perceptually Invisible Image Motion in Extrastriate Visual Area MT of Macaque Monkeys
  • Nov 16, 2011
  • The Journal of Neuroscience
  • Sonja S Hohl + 1 more

Why does the world appear stable despite the visual motion induced by eye movements during fixation? We find that the answer must reside in how visual motion signals are interpreted by perception, because MT neurons in monkeys respond to the image motion caused by eye drifts in the presence of a stationary stimulus. Several features suggest a visual origin for the responses of MT neurons during fixation: spike-triggered averaging yields a peak image velocity in the preferred direction that precedes spikes by ∼60 ms; image velocity during fixation and firing rate show similar peaks in power at 4-5 Hz; and average MT firing during a period of fixation is related monotonically to the image speed along the preferred axis of the neurons 60 ms earlier. The percept caused by the responses of MT neurons during fixation depends on the distribution of activity across the population of neurons of different preferred speeds. For imposed stimulus motion, the population response peaks for neurons that prefer the actual target speed. For small image motions caused by eye drifts during fixation, the population response is large, but is noisy and does not show a clear peak. This representation of image motion in MT would be ignored if perception interprets the population response in the context of a prior of zero speed. Then, we would see a stable scene despite MT responses caused by eye drifts during fixation.
