Abstract

Adaptive filtering is widely used in many fields. However, a fixed step-size LMS algorithm cannot satisfy the demands of fast convergence and low steady-state error at the same time, because a single step factor must trade one against the other. This paper presents an improved variable step-size Least Mean Square (LMS) adaptive filtering algorithm based on the hyperbolic tangent function. The basic principles of several existing variable step-size LMS adaptive filtering algorithms are first analyzed. A novel variable step-size LMS algorithm built on the hyperbolic tangent function is then proposed to increase the convergence rate and suppress the disturbance caused by independent noise. The step size increases or decreases as the mean-square error increases or decreases, allowing the adaptive filter both to track changes in the system and to produce a small steady-state error. The convergence and steady-state behavior of the algorithm are analyzed. Moreover, the algorithm reduces the influence of independent noise by controlling the step size with the autocorrelation of the current error signal e(n) and the previous error signal e(n-1). Simulations were carried out to verify the algorithm's effectiveness, and the MSE learning curves were obtained. The simulation results confirm that the improved algorithm has lower computational cost, a faster convergence rate, and a smaller steady-state error than other variable step-size algorithms.
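
The abstract does not give the exact step-size update, so the following Python sketch only illustrates the general idea: an LMS filter whose step size mu(n) is a hyperbolic-tangent function of the error autocorrelation e(n)e(n-1). The specific rule mu(n) = beta * tanh(alpha * |e(n) e(n-1)|), the parameters alpha and beta, and the function name vss_lms_tanh are assumptions for illustration, not the paper's exact formulation.

    import numpy as np

    def vss_lms_tanh(x, d, num_taps, alpha=10.0, beta=0.05):
        """Variable step-size LMS sketch: mu(n) is driven by tanh of the
        error autocorrelation e(n)*e(n-1), so the step grows when the error
        is large and correlated (far from convergence) and shrinks when the
        error is dominated by uncorrelated noise (near steady state)."""
        w = np.zeros(num_taps)              # adaptive filter weights
        e_prev = 0.0                        # previous error e(n-1)
        errors = np.zeros(len(x))
        for n in range(num_taps - 1, len(x)):
            u = x[n - num_taps + 1:n + 1][::-1]   # input vector [x(n), ..., x(n-M+1)]
            e = d[n] - w @ u                      # a priori error e(n)
            # Assumed step-size rule: beta * tanh(alpha * |e(n) e(n-1)|).
            # The paper's exact function may differ; this only shows the idea.
            mu = beta * np.tanh(alpha * abs(e * e_prev))
            w += mu * e * u                       # standard LMS weight update
            e_prev = e
            errors[n] = e
        return w, errors

    # Example: identify an unknown FIR system from noisy observations.
    rng = np.random.default_rng(0)
    h = np.array([0.8, -0.4, 0.2, 0.1])           # "unknown" system
    x = rng.standard_normal(5000)
    d = np.convolve(x, h, mode="full")[:len(x)] + 0.01 * rng.standard_normal(len(x))
    w_est, e = vss_lms_tanh(x, d, num_taps=4)
    print("estimated taps:", np.round(w_est, 3))

Because tanh saturates, the step size stays bounded by beta for large errors (fast but stable initial convergence) and falls toward zero as e(n)e(n-1) shrinks, which is what yields the small steady-state error described above.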
