Abstract

Auditory motion perception remains poorly understood, in contrast with its well-established visual counterpart. Visual smooth pursuit (SP), a velocity-specific behavior, has been well quantified during both head-fixed (ocular SP) and head-free (eye + head, or gaze SP) conditions. In contrast, auditory SP tracking has received little attention, despite its potential for demonstrating a motion-specific process in audition. We presented constant-velocity (10–40°/s), free-field auditory (0.2–20 kHz white noise) and visual (LED) targets to head-fixed or head-free subjects while recording ocular and gaze SP responses, respectively. To control for possible SP in the absence of a target, subjects were asked to recreate auditory trajectories after priming with a set of auditory ramps (practiced SP). We found that ocular auditory SP is consistently higher in gain than practiced SP, but variable and lower in gain than visual SP. Further, the gain of auditory gaze SP exceeds that of ocular auditory SP. Finally, SP of periodic (triangular) motion trajectories revealed that auditory SP, like visual SP, improves rapidly over time, indicating predictive behavior. In sum, auditory SP closely mimics visual SP but with reduced gain. We propose that auditory motion processing exists, is robust, and recruits a velocity-dependent neural process shared with vision.
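
Note on gain: pursuit gain here denotes the ratio of eye (or gaze) velocity to target velocity, with a value of 1 indicating perfect tracking. The following is a minimal illustrative sketch of how such a gain could be computed from a recorded eye-position trace; the function, variable names, and analysis window are assumptions for illustration, not the study's actual analysis pipeline.

    import numpy as np

    def pursuit_gain(eye_position_deg, target_velocity_dps, sample_rate_hz,
                     window_s=(0.5, 1.5)):
        """Estimate SP gain as mean eye velocity divided by target velocity
        over a steady-state window (seconds). Illustrative sketch only;
        names and window are assumptions, not from the study."""
        # Differentiate eye position to obtain eye velocity in deg/s.
        eye_velocity_dps = np.gradient(eye_position_deg) * sample_rate_hz
        # Restrict to a steady-state window to avoid pursuit-onset latency.
        start = int(window_s[0] * sample_rate_hz)
        stop = int(window_s[1] * sample_rate_hz)
        return np.mean(eye_velocity_dps[start:stop]) / target_velocity_dps

    # Example: a 20 deg/s ramp tracked at 70% of target velocity, 500 Hz sampling.
    t = np.arange(0.0, 2.0, 1.0 / 500)
    simulated_eye = 0.7 * 20.0 * t
    print(round(pursuit_gain(simulated_eye, 20.0, 500), 2))  # ~0.7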
