Abstract

In this paper, a simple and robust variable step-size normalized LMS (VSS-NLMS) adaptive algorithm is proposed. The NLMS algorithm with a fixed step-size usually forces a trade-off between the residual error and the convergence speed of the algorithm. The variable step-size NLMS algorithm presented here eliminates much of this trade-off: varying the step-size allows the VSS-NLMS algorithm to converge faster and to reach a lower steady-state error than the fixed step-size algorithm. We derive the proposed algorithm and analyze its steady-state performance. Computer simulations closely verify the analytical results obtained in this paper. In particular, the simulation results show that the proposed VSS-NLMS algorithm outperforms the traditional NLMS algorithm in both convergence speed and steady-state error.
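
To make the trade-off concrete, the following is a minimal sketch of the standard fixed step-size NLMS update in Python. It is not the paper's VSS-NLMS rule, which the abstract does not specify; the function name, parameter values, and the simulated system are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the standard fixed step-size NLMS update (not the
# paper's VSS-NLMS rule, which the abstract does not specify). All names
# and parameter values here are illustrative assumptions.

import numpy as np

def nlms(x, d, num_taps, mu=0.5, eps=1e-6):
    """Adapt an FIR filter to input x and desired signal d via NLMS."""
    w = np.zeros(num_taps)                    # adaptive filter coefficients
    e = np.zeros(len(x))                      # a-priori error sequence
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # input vector [x(n), ..., x(n-M+1)]
        e[n] = d[n] - w @ u                   # estimation error
        # A fixed mu forces the trade-off described above: a large mu speeds
        # up convergence but raises the residual error, while a small mu does
        # the opposite. A VSS-NLMS scheme replaces mu with a time-varying mu(n).
        w += (mu / (eps + u @ u)) * e[n] * u
    return w, e

# Illustrative usage: identify an unknown 8-tap FIR system in white noise.
rng = np.random.default_rng(0)
h = rng.standard_normal(8)                                    # unknown system
x = rng.standard_normal(5000)                                 # white input
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat, err = nlms(x, d, num_taps=8, mu=0.5)
```

Re-running this sketch with different values of mu illustrates the behavior the abstract refers to: with a single fixed step-size, the convergence rate and the steady-state error floor cannot be tuned independently.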
