Abstract

Bayesian models of object recognition propose the resolution of ambiguity through probabilistic integration of prior experience with available sensory information. Color, even when task-irrelevant, has been shown to modulate high-level cognitive control tasks. However, it remains unclear how color modulations affect lower-level perceptual processing. We investigated whether color affects feature integration using the flash-jump illusion. This illusion occurs when an apparent motion stimulus, a rectangular bar appearing at different locations along a motion trajectory, changes color at a single position. Observers misperceive this color change as occurring farther along the trajectory of motion. This mislocalization error is proposed to be produced by a Bayesian perceptual framework dependent on responses in area V4. Our results demonstrated that the color of the flash modulated the magnitude of the flash-jump illusion such that participants reported less of a shift, i.e., a more veridical flash location, for both red and blue flashes, as compared to green and yellow. Our findings extend color-dependent modulation effects found in higher-order executive functions into lower-level Bayesian perceptual processes. Our results also support the theory that feature integration is a Bayesian process. In this framework, color modulations play an inherent and automatic role as different colors have different weights in Bayesian perceptual processing.

Highlights

  • Our ability to recognize and interact with objects relies heavily on visual perception

  • We focused on feature integration using the flash-jump effect, a visual illusion of color and motion integration first described by Cai and Schlag (2001)

  • We propose that the V4 mislocalization reported by Sundberg et al. (2006) is produced by an earlier Bayesian framework for feature integration, in which the motion extrapolation signal is combined with a color signal that is weighted differently depending on the color priors


Introduction

Our ability to recognize and interact with objects relies heavily on visual perception. Despite incomplete or noisy input, the human visual system is able to perceive objects and object properties with great accuracy. Helmholtz (1867) theorized that the visual system makes unconscious deductions or “inferences” about object and scene properties to resolve this ambiguity, resulting in accurate perception. Helmholtz’s theory of unconscious inference has been formalized into models of Bayesian perception. Bayesian models of visual perception suggest that the resolution of ambiguity occurs through probabilistic integration of prior experience or knowledge (priors) with available sensory information (likelihood), giving rise to a probability distribution of the object property in question. When the sensory information is unreliable, there is increased reliance on prior knowledge (Kersten et al., 2004).
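The prior–likelihood integration described above can be sketched with the standard Gaussian cue-combination model. This is an illustrative example, not the authors' model: with a Gaussian prior and a Gaussian likelihood, the posterior mean is a precision-weighted average, so a noisier sensory measurement pulls the estimate toward the prior.

```python
# Illustrative sketch (assumed Gaussian prior and likelihood, not the
# authors' implementation): Bayesian combination of prior knowledge
# with sensory evidence.

def posterior(mu_prior, sigma_prior, mu_like, sigma_like):
    """Combine a Gaussian prior and likelihood; return (mean, sd)."""
    w_p = 1.0 / sigma_prior ** 2  # precision (reliability) of the prior
    w_l = 1.0 / sigma_like ** 2   # precision of the sensory evidence
    mu = (w_p * mu_prior + w_l * mu_like) / (w_p + w_l)
    sd = (1.0 / (w_p + w_l)) ** 0.5
    return mu, sd

# Reliable sensory input: the estimate stays close to the measurement.
reliable = posterior(0.0, 2.0, 10.0, 0.5)
# Noisy sensory input: the estimate is pulled toward the prior, as in
# the "increased reliance on prior knowledge" described in the text.
noisy = posterior(0.0, 2.0, 10.0, 4.0)
```

With equal precisions the posterior mean falls exactly halfway between prior and likelihood means; as the likelihood broadens, the estimate shifts smoothly toward the prior.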

