The application of dyadic wavelet decomposition to time delay estimation is investigated. We consider a model in which the source signal is deterministic and the received sensor outputs are corrupted by additive noise. Wavelet denoising is exploited to provide an effective solution to the problem. Denoising is first applied to preprocess the signals received at two spatially separated sensors in an attempt to remove the contamination; the peak of their cross-correlation function is then located, from which the time delay between the two signals is derived. A novel wavelet shrinkage/thresholding technique for denoising is introduced, and the performance of the algorithm is analyzed rigorously. It is proved that the proposed method achieves global convergence with high probability. Simulation results also corroborate that the technique is efficient and performs significantly better than both the generalized cross correlator (GCC) and the direct cross correlator (CC).