Abstract

Maximum likelihood difference scaling was used to measure suprathreshold contrast response difference scales for low-frequency Gabor patterns, modulated along luminance and L−M color directions, in normal, protanomalous, and deuteranomalous observers. Based on a signal-detection model, perceptual scale values, parameterized as $d'$, were estimated by maximum likelihood. The difference scales were well fit by a Michaelis-Menten model, permitting estimates of response and contrast gain parameters for each subject. For luminance contrast, anomalous observers showed no significant differences in response or contrast gain from normal observers. For chromatic modulation, however, anomalous observers displayed higher contrast gain and lower response gain than normal observers. These effects cannot be explained by simple pigment shift models, and they support a compensation mechanism that optimizes the mapping of the input contrast range onto the neural response range. A linear relation between response and contrast gain suggests a neural trade-off between them.
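The Michaelis-Menten fit described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's code: the functional form $d'(c) = R_{\max}\,c/(c + c_{50})$, the synthetic data, and the parameter values are all assumptions introduced here, with $R_{\max}$ playing the role of response gain and $c_{50}$ the semi-saturation contrast.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(c, rmax, c50):
    """Hypothetical contrast response function: d'(c) = rmax * c / (c + c50)."""
    return rmax * c / (c + c50)

# Synthetic perceptual scale values (illustrative only, not the paper's data).
contrasts = np.linspace(0.01, 0.5, 20)
true_rmax, true_c50 = 8.0, 0.12   # assumed response gain and semi-saturation contrast
rng = np.random.default_rng(0)
dprime = michaelis_menten(contrasts, true_rmax, true_c50) + rng.normal(0.0, 0.1, 20)

# Least-squares estimates of the two gain parameters.
(rmax_hat, c50_hat), _ = curve_fit(michaelis_menten, contrasts, dprime, p0=[5.0, 0.1])
```

With estimates of $R_{\max}$ and $c_{50}$ per observer, between-group comparisons of response and contrast gain can then be made directly on the fitted parameters.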

Highlights

  • Anomalous trichromacy is classically defined by abnormal shifts in the mixture of reddish and greenish primaries in metameric matches to a yellowish standard [1]

  • To assess shape changes that would depend on contrast gain, we examined the value of $\log_e(c_0/(c_0 + \varsigma))$, which on the log contrast scale is the difference between the log contrasts at $c_0$ and the semi-saturation contrast [Fig. 4(b)]

  • We have demonstrated that maximum likelihood difference scaling (MLDS) is an effective method for obtaining estimates of the change in appearance of Gabor patterns over a contrast range not accessible with threshold measures of contrast sensitivity
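The shape measure mentioned in the highlights can be computed directly; note that the difference of log contrasts is the log of their ratio. The function name and the example values below are illustrative assumptions, not taken from the paper.

```python
import math

def log_contrast_shape(c0, sigma):
    # Hypothetical helper: difference between the log contrast at c0 and at the
    # semi-saturation point (c0 + sigma), i.e. log_e(c0 / (c0 + sigma)).
    return math.log(c0) - math.log(c0 + sigma)

# Illustrative values only: c0 = 0.5, semi-saturation parameter sigma = 0.1.
shape = log_contrast_shape(0.5, 0.1)
```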

Introduction

Anomalous trichromacy is classically defined by abnormal shifts in the mixture of reddish and greenish primaries in metameric matches to a yellowish standard [1]. Based on colorimetric [2,3,4,5] and genetic [6,7] studies, it is generally held that the change in matching behavior is explained by shifts in the peak spectral sensitivities of the middle- (M-) or long- (L-) wavelength-sensitive cone photoreceptors relative to normal [Fig. 1(a)]. While there is variation in peak separations for both normal [12] and anomalous observers [11], the loss of chromatic sensitivity for average anomalous observers can be visualized by plotting the difference between the two long-wavelength spectral sensitivities, as shown in Fig. 1(b) [8]. The long-wavelength chromatic difference signal is reduced in anomalous observers with respect to the normal curve: the peak-to-trough difference of the protanomalous curve is 41% of the normal, and that of the deuteranomalous, 25% of the normal.

