Abstract

The extended integration time of visual neurons can produce the neural equivalent of an orientation cue along the axis of motion in response to fast-moving objects. The dominant model argues that these motion streaks resolve the inherent directional uncertainty arising from the small receptive fields in V1 by combining spatial orientation with motion signals within V1. We tested this model in humans using visual aftereffects, in which adapting to a static grating causes the perceived direction of a subsequently presented motion stimulus to be tilted away from the adapting orientation. We found that a much broader range of orientations produced aftereffects than the current model predicts, suggesting that these orientation cues influence motion perception at a later stage than V1. We also found that varying the spatial frequency of the adaptor changed the aftereffect from repulsive to attractive for motion-test but not form-test stimuli. Finally, manipulating V1 excitability with transcranial stimulation reduced the aftereffect, indicating that the orientation cue depends on V1. These results can be accounted for if the orientation information from the motion streak, gathered in V1, enters the motion system at a later stage of motion processing, most likely V5. A computational model of motion direction is presented incorporating gain modifications of broadly tuned motion-selective neurons by narrowly tuned orientation-selective cells in V1, which successfully accounts for the extant data. These results reinforce the suggestion that orientation places strong constraints on motion processing, but in a previously undescribed manner.
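The proposed model's core mechanism, narrowly tuned orientation-selective cells applying a gain change to broadly tuned motion-selective units, can be illustrated with a minimal population-coding sketch. This is not the authors' implementation; the tuning widths (60° for motion units, 15° for the orientation gain), adaptation depth, and population-vector decoder are illustrative assumptions chosen only to show how adaptation at an orientation near the motion axis repels the decoded direction.

```python
import math

# Hypothetical population of motion-selective units, preferred directions 0-350 deg.
DIR_PREFS = [float(d) for d in range(0, 360, 10)]

def gauss(diff, sigma, period=360.0):
    """Gaussian tuning on a circular variable (wrapped to +/- period/2)."""
    d = (diff + period / 2) % period - period / 2
    return math.exp(-d * d / (2 * sigma * sigma))

def perceived_direction(true_dir, adapt_ori=None,
                        motion_sigma=60.0,   # broad motion tuning (assumed)
                        streak_sigma=15.0,   # narrow orientation tuning (assumed)
                        depth=0.4):          # adaptation gain reduction (assumed)
    """Decode direction from gain-modulated motion units via a population vector."""
    x = y = 0.0
    for p in DIR_PREFS:
        resp = gauss(p - true_dir, motion_sigma)
        gain = 1.0
        if adapt_ori is not None:
            # Orientation is axial (180 deg period): adaptation lowers the gain of
            # motion units whose axis of motion matches the adapted orientation.
            gain -= depth * gauss(p - adapt_ori, streak_sigma, period=180.0)
        x += resp * gain * math.cos(math.radians(p))
        y += resp * gain * math.sin(math.radians(p))
    return math.degrees(math.atan2(y, x)) % 360.0
```

With no adaptor the decoded direction matches the true direction; adapting to an orientation 10° from the motion axis suppresses units on that side, repelling the decoded direction away from the adapted orientation, as in the repulsive aftereffect described above.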
