In perceiving color, the eye performs a wavelength-discrimination process analogous to the angular discrimination performed by a tracking radar. There are two basic principles for achieving angular discrimination: 1) multiple detectors with different angular response characteristics, and 2) a single detector that scans its response characteristic. Up to now only the multiple-detector approach has been applied to explain the phenomenon of color vision. This paper postulates that the eye employs the scanning discrimination principle to perceive color. A wavelength-dependent effect within the cone causes light of different wavelengths to produce different spatial distributions of energy in the photodetector region. An electrical process scans across this photodetector region, producing a modulated waveform which defines the color information. The dc value of the waveform gives the white information, the first harmonic gives the blue-yellow information, and the second harmonic gives the green-red information. The phase determines the difference between blue and yellow and between green and red. The waveform is demodulated in the retina to generate separate dc voltages which produce the white-black, blue-yellow, and green-red sensations.
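The harmonic decomposition described above can be illustrated with a short sketch. The code below is a hypothetical illustration, not the paper's model: it treats one scan period of the modulated waveform as a sampled signal and extracts the dc value, first harmonic, and second harmonic, which the paper identifies with the white, blue-yellow, and green-red signals; the function name and the toy waveform are assumptions introduced only for the example.

```python
import numpy as np

def opponent_signals(waveform):
    """Decompose one scan period of the waveform into opponent-color signals.

    waveform: samples of the modulated waveform over one scan period.
    Returns (white, blue_yellow, green_red). The opponent terms are signed:
    sign is set by the harmonic's phase (blue vs. yellow, green vs. red),
    magnitude by its amplitude.
    """
    n = len(waveform)
    spectrum = np.fft.rfft(waveform) / n
    white = spectrum[0].real            # dc value -> white information
    blue_yellow = 2 * spectrum[1].real  # first harmonic, phase-referenced to the scan
    green_red = 2 * spectrum[2].real    # second harmonic, phase-referenced to the scan
    return white, blue_yellow, green_red

# Usage: a toy waveform whose first harmonic dominates reads as strongly
# blue or yellow; the sign of the second-harmonic term flips between
# green and red with its phase.
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
scan = 1.0 + 0.6 * np.cos(t) + 0.2 * np.cos(2 * t + np.pi)
print(opponent_signals(scan))  # approximately (1.0, 0.6, -0.2)
```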