Birds and mammals exploit interaural time differences (ITDs) for sound localization. After ITD detection by brainstem neurons, ITD processing continues in parallel midbrain and forebrain pathways. In the barn owl, both ITD detection and midbrain processing are specialized to extract ITDs independently of frequency, which amounts to a pure time delay representation. Recent results have elucidated different mechanisms of ITD detection in mammals, which lead to a representation of small ITDs in high-frequency channels and large ITDs in low-frequency channels, resembling a phase delay representation. However, the detection mechanism does not preclude a change in ITD representation at higher processing stages. Here we analyzed ITD tuning across frequency channels with pure-tone and noise stimuli in neurons of the barn owl's auditory arcopallium, a nucleus at the endpoint of the forebrain pathway. To extend the analysis of ITD representation across frequency bands to a large neural population, we employed Fourier analysis for the spectral decomposition of ITD curves recorded with noise stimuli. The method was validated with both physiological and model data. We found that low frequencies convey sensitivity to large ITDs, whereas high frequencies convey sensitivity to small ITDs. Moreover, distinct linear phase-frequency regimes in the high-frequency and low-frequency ranges suggested independent convergence of inputs from these frequency channels. Our results are consistent with the ITD representation being remodeled toward a phase delay representation along the forebrain pathway. This indicates that sensory representations may undergo substantial reorganization, presumably in relation to specific behavioral outputs.
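The spectral decomposition referred to above can be illustrated with a minimal sketch: a noise-delay (ITD tuning) curve sampled at uniform ITD steps is passed through a discrete Fourier transform, yielding the amplitude and phase of each frequency component. The function name, parameters, and synthetic data below are hypothetical illustrations, not the authors' analysis code, and assume NumPy is available.

```python
import numpy as np

def decompose_itd_curve(itd_us, rate, max_freq_hz=10000.0):
    """Fourier decomposition of a noise-delay (ITD tuning) curve.

    itd_us : ITDs in microseconds, sampled at a uniform step.
    rate   : mean spike rate measured at each ITD.
    Returns component frequencies (Hz), amplitudes, and phases;
    phase is referenced to the first ITD sample.
    """
    itd_s = np.asarray(itd_us, dtype=float) * 1e-6   # microseconds -> seconds
    rate = np.asarray(rate, dtype=float)
    dt = itd_s[1] - itd_s[0]                         # uniform ITD step (s)
    rate = rate - rate.mean()                        # remove the DC offset

    spectrum = np.fft.rfft(rate)
    freqs = np.fft.rfftfreq(rate.size, d=dt)         # cycles per second of ITD

    keep = freqs <= max_freq_hz                      # restrict to audible range
    amplitude = 2.0 * np.abs(spectrum[keep]) / rate.size
    phase = np.angle(spectrum[keep])
    return freqs[keep], amplitude[keep], phase[keep]

# Synthetic example: an ITD curve dominated by a ~5 kHz component
itd = np.arange(-300, 301, 10)                       # ITDs in microseconds
curve = 20.0 + 15.0 * np.cos(2 * np.pi * 5000.0 * itd * 1e-6)
freqs, amp, phase = decompose_itd_curve(itd, curve)
print(f"dominant component near {freqs[np.argmax(amp)]:.0f} Hz")
```

In a sketch of this kind, the amplitude spectrum indicates which frequency bands dominate a neuron's noise-delay curve, and the slope of the phase-versus-frequency relation in a given band can be read as a delay; separate linear phase-frequency regimes at low and high frequencies would be one way such an analysis could point to independent convergence of inputs from those channels.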