Abstract

This work presents a unified performance analysis of the family of normalized least mean (NLM) algorithms under very weak assumptions. The analysis is built around a recently proposed performance measure called the effective weight deviation vector, so named because it is the only component of the weight deviation that contributes to the excess estimation error of the adaptive filter. Using this concept, both the steady-state analysis and the tracking analysis are presented within a unified framework for the whole family of NLM algorithms. The derived results are therefore valid for any normalized stochastic gradient algorithm minimizing the 2p-th power of the error, where p is an integer. The novelty of the analysis resides in the fact that it imposes no restrictions on the dependence between successive input regressors, the dependence among the elements of each regressor, the length of the adaptive filter, or the distributions of the noise and the filter input. Moreover, the approach is not limited to small step sizes; the analysis is valid for all step-size values within the stable range of the NLM algorithms. Furthermore, both stationary and non-stationary input-signal and plant scenarios are considered. Consequently, asymptotic time-averaged convergence results for the mean-square effective weight deviation, the mean absolute excess estimation error, and the mean-square excess estimation error of the NLM algorithms are established for both constant and time-varying plants. Finally, simulation results are presented to corroborate the theoretical findings.
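As a concrete illustration of the algorithm family treated here (not drawn from the paper itself), the sketch below shows a generic normalized stochastic-gradient update minimizing the 2p-th power of the a-priori error; setting p = 1 recovers the familiar NLMS recursion. The regressor-energy normalization and the names mu, eps, and nlm_update are assumptions made for this example only; the paper's own definition of the NLM family governs the general case.

```python
import numpy as np

def nlm_update(w, x, d, mu=0.5, p=1, eps=1e-8):
    """One normalized stochastic-gradient update minimizing the 2p-th power
    of the error (illustrative form only; p = 1 gives the NLMS recursion)."""
    e = d - w @ x                            # a-priori estimation error
    grad = (e ** (2 * p - 1)) * x            # stochastic gradient direction of e^{2p}
    w_next = w + mu * grad / (eps + x @ x)   # step normalized by regressor energy (assumed)
    return w_next, e

# Hypothetical system-identification run with a constant (stationary) plant
rng = np.random.default_rng(0)
w_true = rng.standard_normal(8)              # unknown plant to be identified
w = np.zeros(8)
for _ in range(2000):
    x = rng.standard_normal(8)                        # input regressor
    d = w_true @ x + 0.01 * rng.standard_normal()     # noisy desired signal
    w, e = nlm_update(w, x, d, mu=0.5, p=1)           # NLMS member of the family
```

For a time-varying plant, one would let w_true drift over the iterations, which corresponds to the tracking scenario analyzed in the paper.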
