The interaural time difference (ITD) is a primary horizontal-plane sound localization cue computed in the auditory brainstem. ITDs are accessible in the temporal fine structure of pure tones at frequencies up to about 1400 Hz. Why listeners' ITD sensitivity declines from its best values near 700 Hz to undetectable within about one octave currently lacks a fully compelling physiological explanation. Here, it was hypothesized that the rapid decline in ITD sensitivity is dictated not by a central neural limitation but by the initial peripheral sound encoding, specifically by the low-frequency (apical) portion of the cochlear excitation pattern produced by a pure tone. ITD sensitivity was measured in 16 normal-hearing listeners as a joint function of frequency (900-1500 Hz) and level (10-50 dB sensation level). Performance decreased with increasing frequency and decreasing sound level. The slope of the performance decline was 90 dB/octave, consistent with the low-frequency slope of the cochlear excitation pattern. Fine-structure ITD sensitivity near 1400 Hz may therefore be conveyed primarily by "off-frequency" activation of neurons tuned to lower frequencies near 700 Hz. Physiologically, this could be realized by neurons sensitive to fine-structure ITD only up to about 700 Hz. A more extreme model would have only a single narrow channel near 700 Hz that conveys fine-structure ITDs. Such a model is a major simplification of, and departure from, the classic formulation of the binaural display, which consists of a matrix of neurons tuned to a wide range of relevant frequencies and ITDs.
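To make the frequency-by-level trade-off concrete, the sketch below illustrates the hypothesized off-frequency mechanism: the excitation reaching a putative 700 Hz ITD channel from a tone above 700 Hz falls off at roughly the 90 dB/octave low-frequency slope reported here, so the usable level at that place depends jointly on tone frequency and presentation level. This is a minimal illustration, not the authors' model; the 90 dB/octave slope and the 700 Hz channel frequency are the only values taken from the abstract, and all other details are assumptions.

```python
# Illustrative sketch (not the authors' model): effective excitation level at a
# putative 700 Hz ITD channel for a pure tone of frequency f (Hz) and level L
# (dB sensation level), assuming a low-frequency excitation-pattern slope of
# ~90 dB/octave. Parameter values other than the slope and 700 Hz are assumed.
import math

SLOPE_DB_PER_OCTAVE = 90.0   # low-frequency excitation-pattern slope (from abstract)
CHANNEL_CF_HZ = 700.0        # best frequency of the hypothesized ITD channel

def effective_level_db(freq_hz: float, level_db_sl: float) -> float:
    """Excitation level reaching the 700 Hz place from a tone at or above 700 Hz."""
    octaves_above = max(0.0, math.log2(freq_hz / CHANNEL_CF_HZ))
    return level_db_sl - SLOPE_DB_PER_OCTAVE * octaves_above

if __name__ == "__main__":
    # Frequencies and levels spanning the ranges tested in the study.
    for f in (900, 1100, 1300, 1500):
        for level in (10, 30, 50):
            print(f"{f} Hz tone at {level} dB SL -> "
                  f"{effective_level_db(f, level):.1f} dB at the 700 Hz place")
```

Under these assumptions, raising the tone frequency by one octave has the same effect on the 700 Hz channel as lowering the presentation level by about 90 dB, which is why performance would worsen with increasing frequency and decreasing level.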