Abstract

This paper studies the convergence behavior of the normalized least mean square (NLMS) and normalized least mean M-estimate (NLMM) algorithms. Our analysis extends the framework of Bershad [6], [7], which was previously developed for the NLMS algorithm with Gaussian inputs. Because of the difficulty of evaluating certain expectations involved, the behavior of the NLMS algorithm for general eigenvalue distributions of the input autocorrelation matrix was not fully analyzed in [6], [7]. In this paper, using an extension of Price's theorem to Gaussian mixture distributions and by introducing certain special integral functions, closed-form expressions for these expectations are obtained, which allow us to characterize the convergence performance of both the NLMS and the NLMM algorithms in contaminated Gaussian (CG) noise. The validity of the proposed analysis is verified through computer simulations.
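The abstract does not give the algorithms' recursions, so the sketch below is only a rough illustration of the setting being analyzed: a standard NLMS update alongside a variant whose error is scaled by a Huber-type weight, used here as a stand-in for the NLMM error weighting, driven by a simple two-component Gaussian mixture ("contaminated Gaussian") noise model. All parameter values (step size, regularization, weight threshold, mixture variances, contamination probability) are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def cg_noise(n, var_g=0.01, var_imp=10.0, p_imp=0.01):
    """Contaminated Gaussian noise: nominal Gaussian component plus rare
    large-variance Gaussian impulses (illustrative parameters)."""
    nominal = rng.normal(0.0, np.sqrt(var_g), n)
    impulses = rng.binomial(1, p_imp, n) * rng.normal(0.0, np.sqrt(var_imp), n)
    return nominal + impulses

def huber_weight(e, xi=0.5):
    """Huber-type M-estimate weight: unit weight for small errors,
    down-weighting of large (likely impulsive) errors."""
    return 1.0 if abs(e) <= xi else xi / abs(e)

def run_filter(x, d, L=8, mu=0.5, eps=1e-6, robust=False):
    """Normalized LMS update; with robust=True the a priori error is scaled
    by a Huber-type weight as a stand-in for the NLMM weighting."""
    w = np.zeros(L)
    for n in range(L - 1, len(x)):
        u = x[n - L + 1:n + 1][::-1]            # input regressor
        e = d[n] - w @ u                        # a priori error
        q = huber_weight(e) if robust else 1.0  # NLMM-style weighting
        w = w + mu * q * e * u / (eps + u @ u)  # normalized update
    return w

# Toy system-identification setup: white Gaussian input, unknown FIR plant,
# observation corrupted by contaminated Gaussian noise.
L = 8
w_true = rng.normal(size=L)
x = rng.normal(size=5000)
d = np.convolve(x, w_true)[:len(x)] + cg_noise(len(x))

print("NLMS misalignment:", np.linalg.norm(run_filter(x, d, L) - w_true))
print("NLMM-style misalignment:", np.linalg.norm(run_filter(x, d, L, robust=True) - w_true))
```

In such an experiment the Huber-weighted variant is typically less perturbed by the impulsive component of the noise, which is the qualitative behavior the paper's analysis of the NLMM algorithm addresses.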
