Abstract

In this paper, we modify the GLMS algorithm into the proposed LMS-SAS algorithm, which converges more effectively to the global minimum of the mean-square output error (MSE) objective function. We also derive an infinite impulse response normalized least-mean-square (IIR-NLMS) algorithm whose behavior is similar to that of the LMS-SAS algorithm. The GLMS algorithm achieves its global search capability by adding a random perturbing noise to the LMS update. In the proposed LMS-SAS algorithm, we instead scale this perturbing noise by the MSE objective function. For the NLMS algorithm, the gradient estimation error, which arises naturally in the adaptive process, serves as the perturbing noise. We show, theoretically and experimentally, that both the LMS-SAS and NLMS algorithms converge to the global minimum.
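To make the error-scaled perturbation idea concrete, the following is a minimal sketch of one LMS-SAS-style iteration, assuming a standard LMS gradient step followed by a random perturbation whose amplitude is scaled by the instantaneous squared error as a stand-in for the MSE objective. The step size mu, noise scale sigma, and function names are illustrative assumptions, not taken from the paper, and the toy FIR setup is used only to show the update mechanics (the multimodal MSE surface that motivates global search arises with IIR filters).

```python
import numpy as np

def lms_sas_step(w, x, d, mu, sigma, rng):
    """One illustrative LMS-SAS-style update (hypothetical sketch).

    Standard LMS gradient step, plus a random perturbation whose
    amplitude is scaled by the instantaneous squared error -- a
    stand-in for the MSE objective described in the abstract.
    """
    y = np.dot(w, x)             # filter output
    e = d - y                    # output error
    w = w + mu * e * x           # ordinary LMS gradient step
    # Perturbing noise scaled by the (instantaneous) MSE objective:
    # a large error yields large random search steps that can escape
    # local minima; near the global minimum the error, and hence the
    # perturbation, shrinks and the update settles down.
    w = w + sigma * (e ** 2) * rng.standard_normal(w.shape)
    return w, e

# Toy usage: identify an unknown system from noisy observations.
rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.3, 0.2])
w = np.zeros(3)
for _ in range(5000):
    x = rng.standard_normal(3)
    d = np.dot(true_w, x) + 0.01 * rng.standard_normal()
    w, e = lms_sas_step(w, x, d, mu=0.01, sigma=0.05, rng=rng)
```

The key design choice this illustrates is that, unlike GLMS, the perturbation is not injected at a fixed level: tying its magnitude to the objective value makes the search aggressive when the error is large and nearly noiseless once a good solution is found.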
