Abstract

A new method has been developed to approximate one Gaussian mixture by another in a process that generalizes the idea of importance re-sampling in a particle filter. The algorithm is part of a broader effort to generalize the particle filter concept. In a traditional particle filter, the underlying probability density function is described by particles: Dirac delta functions with infinitesimal covariances. This paper develops an important component of a “blob” filter, which instead uses a Gaussian mixture of “fattened,” finite-covariance blobs. The goal of a blob filter is to save computational effort for a given level of probability density precision by using many fewer blobs than particles. Most of the techniques necessary for this type of filter have already been developed. The one missing component is developed in this paper: a re-sampling algorithm that bounds the covariance of each mixture element while accurately reproducing the original probability distribution. The covariance bounds are needed to keep the blobs from becoming too “fat”; otherwise, Extended Kalman Filter (EKF) or Unscented Kalman Filter dynamic propagation and measurement update calculations would incur excessive truncation error for each blob. The re-sampling algorithm is described in detail, and its performance is studied using several simulated test cases. Also discussed is the usefulness of a Gaussian mixture and EKF-like techniques for nonlinear dynamic propagation and nonlinear measurement update of probability distributions.
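As a rough illustration of the data structure and the re-sampling idea the abstract describes, the Python sketch below represents a Gaussian mixture as weights, means, and covariances, and performs a naive importance-style re-sampling step that caps each new component's covariance. This is only a hedged sketch: the function name `resample_blobs`, the eigenvalue-clipping rule, and all parameters are hypothetical assumptions for illustration, not the paper's algorithm.

```python
# Illustrative sketch only -- NOT the paper's re-sampling algorithm.
# A Gaussian mixture of finite-covariance "blobs" is stored as
# (weights, means, covariances).  Re-sampling draws new blob centers
# from the mixture (as in particle re-sampling) and then bounds each
# new blob's covariance, here by crudely clipping eigenvalues.
import numpy as np

rng = np.random.default_rng(0)

def resample_blobs(weights, means, covs, n_new, cov_cap):
    """Draw n_new blob centers from the mixture, then assign each new
    blob a covariance whose eigenvalues are capped at cov_cap."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    idx = rng.choice(len(weights), size=n_new, p=weights)
    new_means = np.array(
        [rng.multivariate_normal(means[i], covs[i]) for i in idx]
    )
    new_covs = []
    for i in idx:
        # Eigen-decompose the parent covariance and clip its
        # eigenvalues -- a stand-in for the paper's covariance bounds.
        vals, vecs = np.linalg.eigh(covs[i])
        vals = np.minimum(vals, cov_cap)
        new_covs.append(vecs @ np.diag(vals) @ vecs.T)
    new_weights = np.full(n_new, 1.0 / n_new)
    return new_weights, new_means, np.array(new_covs)

# Example: a two-component 2-D mixture re-sampled into five blobs.
w = [0.3, 0.7]
mu = [np.zeros(2), np.array([4.0, 1.0])]
P = [np.eye(2), 2.0 * np.eye(2)]
w2, mu2, P2 = resample_blobs(w, mu, P, n_new=5, cov_cap=0.5)
print(w2, mu2.shape, P2.shape)
```

The sketch conveys why covariance bounding matters: each resulting blob stays narrow enough that an EKF- or UKF-style propagation of that single component remains a reasonable local approximation.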
