Abstract

Noise is a pervasive issue that significantly degrades the performance of adaptive filtering algorithms. To address this challenge, the bias compensation technique has recently emerged, which incorporates an additional term into the update equation. Despite the apparent simplicity of the bias-compensated least-mean-squares algorithm, a comprehensive theoretical analysis of its performance is a complex task. In this paper, we carry out an exact expectation analysis to demonstrate the asymptotic unbiasedness of the algorithm, even without the commonly adopted independence assumption between the adaptive coefficients and the input data. Furthermore, to improve the understanding of the algorithm's stability, we employ a stochastic model that assumes independence between the radial and angular distributions of the input vector. The resulting model is intricate, requiring heuristic approximations to derive practical insights. Notably, our analysis reveals that the upper bound on the step size of the bias-compensated least-mean-squares algorithm is consistently smaller than that of the least-mean-squares algorithm. These findings are strongly supported by extensive simulations.
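For context on the update equation mentioned above, the following is a minimal sketch of a bias-compensated LMS recursion of the kind studied here, assuming a noisy input regressor with known noise variance; the function name, variable names (x_noisy, sigma_in2, mu), and the specific form of the compensation term are illustrative assumptions, not reproduced from the paper.

```python
import numpy as np

def bc_lms(x_noisy, d, num_taps, mu, sigma_in2):
    """Illustrative bias-compensated LMS sketch (not the paper's exact formulation).

    x_noisy   : noise-corrupted input signal (1-D array)
    d         : desired signal (1-D array, same length)
    num_taps  : filter length
    mu        : step size
    sigma_in2 : assumed-known variance of the input noise
    """
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(d)):
        u = x_noisy[n - num_taps + 1:n + 1][::-1]  # noisy regressor, most recent sample first
        e = d[n] - w @ u                           # a priori error
        # Standard LMS term plus a bias-compensation term proportional to the
        # input-noise variance, intended to counteract the bias induced by the noisy regressor.
        w = w + mu * (e * u + sigma_in2 * w)
    return w
```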
