Abstract

This paper presents a study of sequential parameter estimation based on a linear non-Gaussian observation model. To develop robust algorithms, we consider a family of heavy-tailed distributions that can be expressed as scale mixtures of Gaussians, and we extend the development to include several robust penalty functions. We treat the problem as a Bayesian learning problem and develop an iterative algorithm using the Laplace approximation for the posterior and the minorization-maximization (MM) algorithm as an optimization tool. We then study a one-step implementation of the iterative algorithm, which leads to a family of generalized robust RLS-type algorithms that includes several well-known algorithms as special cases. A further simplification, in which the covariance is held fixed, yields a family of generalized robust LMS-type algorithms. Through mathematical analysis and simulations, we demonstrate the robustness of these algorithms.
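To illustrate the flavor of such robust RLS-type updates, the following is a minimal sketch, assuming a Huber-type weight on the innovation; the specific weight, forgetting factor, and initialization are illustrative choices and not necessarily those derived in the paper.

```python
import numpy as np

def huber_weight(e, delta=1.0):
    """Huber score-based weight psi(e)/e: 1 for |e| <= delta, delta/|e| otherwise."""
    a = abs(e)
    return 1.0 if a <= delta else delta / a

def robust_rls(X, d, lam=0.99, delta=1.0, p0=100.0):
    """Sketch of a robust RLS-type recursion: the a priori error is
    down-weighted by a heavy-tail-motivated factor before the gain and
    covariance updates (assumed form, for illustration only)."""
    n = X.shape[1]
    w = np.zeros(n)
    P = p0 * np.eye(n)                       # initial inverse-correlation estimate
    for x, dk in zip(X, d):
        e = dk - x @ w                       # a priori error (innovation)
        q = huber_weight(e, delta)           # robust down-weighting of outliers
        Px = P @ x
        k = q * Px / (lam + q * (x @ Px))    # weighted gain vector
        w = w + k * e                        # parameter update
        P = (P - np.outer(k, Px)) / lam      # covariance (inverse-correlation) update
    return w
```

Holding P fixed at a scaled identity in this recursion reduces the update to a robust LMS-type step, mirroring the simplification described in the abstract.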
