It is well known that head movements are instrumental in resolving front/back confusions in human sound localization. A mechanism is proposed here to extend current cross-correlation-based binaural models to compensate for head movements. The algorithm tracks sound sources both in the head-related coordinate system (HRCS) and in the room-related coordinate system (RRCS), and it is aware of the current head orientation within the room. The sounds are positioned in space using an HRTF catalog with 1° azimuthal resolution. The position of the sound source is determined from the interaural cross-correlation (IACC) functions in several auditory bands, which are mapped onto functions of azimuth and superposed. The maxima of these cross-correlation functions indicate the position of the sound source, but typically two peaks occur: one at or near the correct location and a second at the front/back-reversed position. When the model is programmed to turn its head virtually, the degree-based cross-correlation functions are shifted by the current head angle to match the RRCS. During this procedure, the IACC peak in the correct hemisphere prevails when integrated over the duration of the head movement, whereas the front/back-reversed peak averages out.
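The disambiguation principle described above can be illustrated with a minimal numerical sketch. The code below is not the authors' implementation; it assumes an idealized azimuth-domain correlation function with two Gaussian peaks (one at the true source azimuth in head coordinates, one at its front/back mirror), and the specific source azimuth (40°), head-turn range (0–30°), and peak width are illustrative choices. Shifting each frame by the current head angle into room coordinates and summing shows the true peak accumulating while the reversed peak smears out:

```python
import numpy as np

AZ = np.arange(360)  # azimuth grid in degrees (room coordinates, RRCS)

def peak(center, sigma=5.0):
    """Gaussian-shaped pseudo-correlation peak on a circular azimuth axis."""
    d = (AZ - center + 180) % 360 - 180  # wrapped angular distance to the center
    return np.exp(-0.5 * (d / sigma) ** 2)

true_az_room = 40              # assumed source azimuth in the room (RRCS)
accumulated = np.zeros(360)    # time-integrated, room-referenced correlation

for head in range(0, 31):      # virtual head turn from 0 to 30 degrees
    az_hrcs = (true_az_room - head) % 360   # source azimuth relative to the head
    mirror_hrcs = (180 - az_hrcs) % 360     # front/back-reversed position
    iacc_like = peak(az_hrcs) + peak(mirror_hrcs)  # two peaks in the HRCS
    # Shift by the current head angle to express the function in the RRCS,
    # then integrate over the duration of the movement.
    accumulated += np.roll(iacc_like, head)

# The true-source peak stays fixed in room coordinates and accumulates;
# the reversed peak lands at a different room azimuth on every frame.
print(int(np.argmax(accumulated)))
```

With these assumptions, the integrated function peaks at the true room azimuth (40°), while the energy of the reversed peak is spread over a range of room azimuths and never dominates.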