Abstract

This chapter explains why visual motion perception is not just perception of the changing positions of moving objects. Computationally complementary processes represent static objects with different orientations and moving objects with different motion directions via parallel cortical form and motion streams through V2 and MT. The motion stream pools multiple oriented object contours to estimate object motion direction. Such pooling coarsens estimates of object depth, which require precise matches of oriented stimuli from both eyes. Negative aftereffects of form and motion stimuli illustrate these complementary properties. Feature tracking signals begin to overcome the directional ambiguities that arise from the aperture problem. Motion capture by short-range and long-range directional filters, together with competitive interactions, processes feature tracking signals and ambiguous directional signals to generate a coherent representation of object motion direction and speed. Many properties of motion perception are explained, notably the barberpole illusion and properties of long-range apparent motion, including how apparent motion speed varies with flash interstimulus interval, distance, and luminance; apparent motion of illusory contours; phi and beta motion; split motion; gamma motion; Ternus motion; Korte’s Laws; the line motion illusion; induced motion; motion transparency; the chopsticks illusion; Johansson motion; and Duncker motion. Gaussian waves of apparent motion clarify how tracking occurs and explain how spatial attention shifts through time. This motion processor helps to quantitatively simulate neurophysiological data about motion-based decision-making in monkeys when its output feeds a model of how the lateral intraparietal, or LIP, area chooses a movement direction from the motion direction estimate. Bayesian decision-making models cannot explain these data.
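
The Gaussian wave, or G-wave, account of long-range apparent motion can be summarized computationally: each flash excites cells whose activity waxes while the flash is on and wanes after it turns off, that activity is spread across space by a Gaussian filter, and the peak of the summed activity travels continuously from the first flash's location to the second's. The sketch below illustrates this idea; the leaky-integrator dynamics, the parameter values (rate k, width sigma, flash timing), and the helper name flash_amplitude are illustrative assumptions, not the chapter's fitted model.

```python
import numpy as np

# Minimal sketch of a Gaussian wave ("G-wave") of long-range apparent
# motion, under illustrative assumptions: each flash drives a leaky
# integrator (rate k) whose activity waxes while the flash is on and
# wanes after offset, and that activity is spread over space by a
# Gaussian filter of width sigma. Parameter values are not fitted.

def flash_amplitude(t, onset, offset, k=10.0):
    """Leaky-integrator response to a flash shown on [onset, offset)."""
    a = np.zeros_like(t)
    during = (t >= onset) & (t < offset)
    after = t >= offset
    a[during] = 1.0 - np.exp(-k * (t[during] - onset))      # waxing
    peak = 1.0 - np.exp(-k * (offset - onset))
    a[after] = peak * np.exp(-k * (t[after] - offset))      # waning
    return a

x = np.linspace(-1.0, 3.0, 801)    # space (arbitrary units)
t = np.linspace(0.0, 0.6, 601)     # time (seconds), 1 ms steps
sigma = 1.2                        # spatial filter width

# Two flashes: (position, onset, offset); ISI = 0.15 s between them.
flashes = [(0.0, 0.00, 0.05), (2.0, 0.20, 0.25)]

amps = [flash_amplitude(t, on, off) for _, on, off in flashes]
gauss = [np.exp(-(x - xi) ** 2 / (2 * sigma ** 2)) for xi, _, _ in flashes]

# Summed spatial activity E(x, t); its peak is the "perceived" position.
E = sum(np.outer(a, g) for a, g in zip(amps, gauss))
peak_path = x[np.argmax(E, axis=1)]

# Peak location every 5 ms around the second flash's onset: it glides
# continuously from x = 0 toward x = 2 instead of jumping.
print(np.round(peak_path[190:261:5], 2))
```

With these settings the printed peak positions move smoothly between the two flash locations; lengthening the interstimulus interval or the spatial separation changes when and how fast the peak travels, qualitatively echoing the dependence of apparent motion speed on ISI and distance described above.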
