Abstract

The convergence of the diffusion normalized least mean squares (NLMS) algorithm can be accelerated by decorrelating the input signals. However, the diffusion decorrelation NLMS algorithm faces the conflicting requirements of a fast convergence rate and a small steady-state error. To address this issue, this paper proposes a family of diffusion Bayesian decorrelation least mean squares (DBDLMS) algorithms based on decorrelated observation models. Firstly, the weight update equations of the proposed DBDLMS algorithms are obtained by performing Bayesian inference over the decorrelated observation models, with variable step-sizes emerging naturally to help resolve the conflicting requirements. Secondly, the update equations of the decorrelation coefficient vectors are derived from the Bayesian perspective, with variable step-sizes emerging again. Subsequently, the performance analysis of the proposed DBDLMS algorithms is carried out. Moreover, a simple and effective approach is derived to estimate the free parameters of the proposed DBDLMS algorithms. Finally, the learning performance of the proposed algorithms is verified by Monte Carlo simulations.
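
For readers unfamiliar with the diffusion adaptation framework the abstract refers to, the following is a minimal sketch of a diffusion NLMS adapt-then-combine (ATC) update over a small network. It is not the paper's DBDLMS algorithm (which uses decorrelated observation models and Bayesian variable step-sizes); the network size, signal model, combination weights, and the fixed step-size `mu` below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, T = 5, 8, 2000                  # nodes, filter length, iterations
w_true = rng.standard_normal(M)       # unknown system to identify

# Assumed combination matrix: uniform weights over a fully connected network.
A = np.array([[0.5 if i == j else 0.5 / (N - 1) for j in range(N)]
              for i in range(N)])

w = np.zeros((N, M))                  # local weight estimates
mu, eps = 0.5, 1e-6                   # fixed step-size and regularizer (assumed)

for t in range(T):
    psi = np.empty_like(w)
    # Adapt step: each node runs a local NLMS update on its own data.
    for k in range(N):
        u = rng.standard_normal(M)                      # regressor at node k
        d = u @ w_true + 0.01 * rng.standard_normal()   # noisy desired signal
        e = d - u @ w[k]
        psi[k] = w[k] + mu * e * u / (u @ u + eps)
    # Combine step: each node averages the intermediate estimates of its neighbors.
    w = A @ psi

print("mean squared deviation:", np.mean((w - w_true) ** 2))
```

In the DBDLMS family described above, the fixed step-size `mu` would be replaced by variable step-sizes obtained from Bayesian inference, and the regressors would first be passed through an adaptive decorrelation stage.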
