Hearing faces a problem similar to that of vision as the observer moves. Movement of the sensors creates image motion that remains ambiguous until the observer knows the velocity of the eye and head. The visual system solves this problem using motor commands, proprioception, and vestibular information (so-called “extra-retinal signals”), but the solution is not always perfect. Here, we compare the auditory errors made during head rotation with the visual errors made during eye movement. Real-time measurements of head velocity were used to change the gain relating head movement to source movement across a loudspeaker array. The gain at which “extra-cochlear signals” (encoding head rotation) were perceptually matched to “acoustic signals” (encoding source motion across the ears), thus yielding the perception of a stationary source, was small and positive. The gain varied with context, e.g., with the average source direction relative to the head. Two possible accounts of analogous findings in vision will be discussed…
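The gain manipulation described above can be sketched in a few lines: at each sample, the source is rotated by a fraction (the gain) of the measured head rotation, so gain 0 leaves the source world-stationary and gain 1 yokes it to the head. This is a minimal illustrative sketch under assumed parameters (sample rate, velocity trace, function names), not the authors' implementation:

```python
import numpy as np

def update_source_azimuth(source_az, head_velocity, gain, dt):
    """Rotate the source with (gain > 0) or against (gain < 0) the head.

    gain = 0: source stationary in the world.
    gain = 1: source fixed relative to the head.
    All names and values here are illustrative assumptions.
    """
    return source_az + gain * head_velocity * dt

# Hypothetical head-velocity trace: steady 30 deg/s rotation
# sampled at 100 Hz for one second.
dt = 0.01
head_vel = np.full(100, 30.0)

az = 0.0
for v in head_vel:
    az = update_source_azimuth(az, v, gain=0.05, dt=dt)

# A small positive gain (here 0.05) displaces the source slightly
# in the direction of the head turn: 0.05 * 30 deg/s * 1 s = 1.5 deg.
print(round(az, 2))  # 1.5
```

The point of subjective stationarity is then the gain at which listeners report no source motion; the finding that this gain is small and positive means a world-stationary source (gain 0) is perceived to move slightly against the head turn.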