Abstract

A major goal of systems neuroscience is to describe the neural representations of the environment that mediate perception. An experimental strategy adopted by researchers involved in this effort is to obtain three-way correlations between the physical properties of a stimulus, the psychophysical report of the subject, and measures of the intervening neural events. Perception is, of course, a private event, so objective but indirect measures of the percept must be obtained. Experiments concerned with the perceived location of a stimulus often ask subjects to point to the stimulus or to orient to it with an eye movement. Studies using orienting eye movements as the behavioral index of perceived location have become more common now that precise and accurate methods of measuring eye position are readily available.

In this issue, Tollin et al. (p. 1223–1234) report that orienting movements of the eyes in head-restrained cats significantly underestimate the animals' actual ability to localize auditory and visual targets. The target paper compares two commonly used methods of measuring the accuracy and precision of the localization of visual and auditory targets in the cat: eye saccades in animals with their heads restrained, and combined eye-head movements (gaze saccades) in animals without head restraint. The effects of stimulus duration were also examined. The shortest-duration stimuli were no longer present when the orienting movement began, so the change in gaze position may have been based on a stored representation of the spatial location of the target. Long-duration stimuli were still present after the orienting movements began, so dynamic sensory processing could influence movement accuracy. Target location varied in both azimuth and elevation. Because independent neural pathways that depend on different spatial cues are used to compute the azimuth and elevation of acoustic stimuli (see Zwiers et al. 2003 for a summary of relevant data), the effects of head restraint on localization were analyzed separately for azimuth and elevation.

Localization accuracy for both visual and auditory targets improved when measured using a combination of eye and head movements, and the effects were not small. When brief stimuli were presented 9 and 18° in azimuth from the central fixation target, head-restrained animals mislocalized visual and auditory targets by an average of 6.73 and 7.24°, respectively; when the head was not restrained, these errors dropped to 1.21 and 0.15°. Improvement was greatest for the shortest-duration stimuli, not the longer-duration stimuli that allowed dynamic processing during the orienting movement.
