The present study investigates the interaction between ideal binary masking (IdBM) noise reduction and interleaved processing for bilateral cochlear implant (CI) users and normal-hearing (NH) listeners attending to vocoder simulations. IdBM decomposes a signal into time-frequency (T-F) bins and retains only those bins in which the local signal-to-noise ratio exceeds a local criterion (LC). IdBM benefits in CI users have been found to be limited by factors such as current spread from neighboring electrodes. Interleaving channels across the two ears of bilateral CI users is one approach to mitigating the effects of current spread. In the present experiments, NH listeners attending to vocoder simulations and CI users were tested with IEEE sentences presented from different azimuths in speech-shaped noise at a 5 dB signal-to-noise ratio, with IdBM processing at LC values of 5 and -10 dB. Speech intelligibility in noise was poorer for the interleaved condition. As predicted, however, greater IdBM benefits were found for interleaved than for non-interleaved processing in both NH and CI listeners. For CI users, the LC value of 5 dB led to greater improvements, whereas NH listeners showed comparable benefits at both LC values. These findings have relevance for the clinical application of interleaved processing.
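The IdBM operation described above can be sketched in a few lines. This is a minimal illustration, not the study's actual processing chain: the function names, the toy spectrogram magnitudes, and the use of a small floor to avoid division by zero are all assumptions for demonstration. The key idea is that each T-F bin is kept (mask value 1) only when its local SNR exceeds the local criterion LC.

```python
import numpy as np

def ideal_binary_mask(target_spec, noise_spec, lc_db):
    """Return a binary mask keeping T-F bins whose local SNR exceeds LC (dB).

    target_spec, noise_spec: complex or real spectrogram arrays of equal
    shape (frequency x time), with separate access to target and noise
    (hence "ideal"). lc_db: local criterion in dB, e.g. 5 or -10.
    """
    eps = 1e-12  # floor to avoid log of zero in silent bins
    snr_db = 10.0 * np.log10((np.abs(target_spec) ** 2 + eps) /
                             (np.abs(noise_spec) ** 2 + eps))
    return (snr_db > lc_db).astype(float)

def apply_idbm(mixture_spec, mask):
    # Zero out noise-dominated bins; retain target-dominated bins.
    return mixture_spec * mask

# Toy 2x2 spectrograms (magnitudes chosen for illustration only):
# local SNRs are 20, -20, 0, and 0 dB.
target = np.array([[10.0, 0.1],
                   [1.0,  1.0]])
noise = np.ones((2, 2))

mask_strict = ideal_binary_mask(target, noise, lc_db=5.0)
mask_lenient = ideal_binary_mask(target, noise, lc_db=-10.0)
```

With the stricter criterion (LC = 5 dB) only the 20 dB bin survives, while the lenient criterion (LC = -10 dB) also retains the two 0 dB bins, mirroring how the choice of LC trades noise removal against preservation of target energy.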