Abstract

In this paper, we address the robustness, in the sense of l2-stability, of the set-membership normalized least-mean-square (SM-NLMS) and the set-membership affine projection (SM-AP) algorithms. For the SM-NLMS algorithm, we demonstrate that it is robust regardless of the choice of its parameters and that the SM-NLMS enhances the parameter estimation in most of the iterations in which an update occurs, two advantages over the classical NLMS algorithm. Moreover, we also prove that if the noise bound is known, then we can set the SM-NLMS so that it never degrades the estimate. As for the SM-AP algorithm, we demonstrate that its robustness depends on a judicious choice of one of its parameters: the constraint vector (CV). We prove the existence of CVs satisfying the robustness condition, but practical choices remain unknown. We also demonstrate that both the SM-AP and SM-NLMS algorithms do not diverge, even when their parameters are selected naively, provided the additional noise is bounded. Numerical results that corroborate our analyses are presented.

Highlights

  • The classical adaptive filtering algorithms are iterative estimation methods based on the point estimation theory [1]

  • We address the robustness of the set-membership normalized least-mean-square (SM-NLMS) algorithm for the cases of unknown noise bound and known noise bound in subsections 3.3 and 3.4, respectively

  • In addition to the already known advantages of the SM-NLMS algorithm over the NLMS algorithm, regarding accuracy and computational cost, in this paper we demonstrate that: (i) the SM-NLMS algorithm is robust regardless of the choice of its parameters and (ii) the SM-NLMS algorithm uses the input data very efficiently, i.e., it rarely produces a worse estimate w(k + 1) during its update process


Summary

Introduction

The classical adaptive filtering algorithms are iterative estimation methods based on point estimation theory [1]. Two important SM algorithms are the set-membership NLMS (SM-NLMS) and the set-membership AP (SM-AP) algorithms, proposed in [8] and [9], respectively. These algorithms retain the advantages of their classical counterparts, but they are more accurate, more robust against noise, and have lower computational complexity due to the data-selection strategy previously explained [2, 10, 11, 12]. The robustness of the SM-NLMS algorithm is studied, where we discuss the cases in which the noise bound is assumed known and unknown. The criterion given in (3) requires that we adjust the estimates {yk|k} so that the ratio of the estimation-error energy (numerator) to the energy of the uncertainties (denominator) does not exceed η². The parameter δ ∈ R+ is a regularization factor, generally chosen as a small constant, used to avoid division by zero.
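To make the data-selection strategy concrete, the following is a minimal sketch of a single SM-NLMS iteration: the coefficients are updated only when the a priori error magnitude exceeds the bound γ that defines the constraint set, and the regularization constant δ guards the division. The function name and variable names are illustrative, not from the paper.

```python
import numpy as np

def sm_nlms_update(w, x, d, gamma, delta=1e-12):
    """One SM-NLMS iteration (sketch).

    w:     current coefficient vector
    x:     input (regressor) vector
    d:     desired output sample
    gamma: error-magnitude bound defining the constraint set
    delta: small regularization constant avoiding division by zero
    """
    e = d - w @ x                 # a priori output error
    if abs(e) > gamma:            # update only if estimate lies outside the set
        mu = 1.0 - gamma / abs(e)  # data-dependent step size in (0, 1)
        w = w + mu * e * x / (delta + x @ x)
    return w
```

With δ negligible, the update places the a posteriori error exactly on the bound, |d − w(k+1)ᵀx| = γ, and subsequent iterations with the same data pair produce no further update, which illustrates the sparse-update behavior discussed above.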

Robustness of the SM-NLMS algorithm
Robustness of the SM-AP algorithm
Confirming the results for the SM-AP algorithm
Conclusions
