Abstract

Today, the field of practical estimation is dominated by linear-Bayesian estimators, as these familiar techniques are computationally efficient and can be quickly tailored to many systems of interest. However, beyond the well-known vulnerability of linear estimators to non-Gaussian noise and nonlinear transformations, Bayes' rule itself lacks robustness, as it inherently assumes that all statistical models are exactly known. Therefore, in lieu of filters outfitted with linear-Bayesian updates, this work proposes a method to derive nonlinear, non-Bayesian updates from first principles using the theory of generalized variational inference (GVI)—an optimization approach to information fusion. By selecting different divergence measures, loss functions, and feasible distributions, a wide variety of robust/conservative filter updates are quickly built and tested using numerical optimization. From these GVI candidates, a suitable update that uses the robust γ-loss function is selected based upon its desirable estimation behavior. Via analytical optimization and Gaussian mixture modeling, the γ-loss GVI update is realized as a nonlinear, closed-form filter that is well-suited for practical estimation.
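To make the GVI framing concrete, the following is a minimal one-dimensional sketch (an illustrative setup, not the paper's exact formulation): the GVI posterior is the member of a feasible family—here a Gaussian N(m, s²)—that minimizes an expected loss plus a divergence to the prior. Choosing the negative log-likelihood loss and the KL divergence recovers the standard Bayesian posterior; swapping in a robust loss such as the γ-loss would yield a different, non-Bayesian update. All variable names and the measurement model are assumptions for this example.

```python
# Minimal GVI update sketch: q* = argmin_q  E_q[loss(x; y)] + KL(q || prior),
# optimized numerically over Gaussian candidates q = N(m, s^2).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
prior_mean, prior_std = 0.0, 1.0   # prior: x ~ N(0, 1)
y, meas_std = 2.0, 0.5             # measurement: y = x + noise, noise ~ N(0, 0.5^2)

def kl_gauss(m, s, m0, s0):
    """Closed-form KL( N(m, s^2) || N(m0, s0^2) )."""
    return np.log(s0 / s) + (s**2 + (m - m0) ** 2) / (2 * s0**2) - 0.5

eps = rng.standard_normal(512)     # fixed reparameterization samples for E_q[.]

def gvi_objective(params):
    m, log_s = params
    s = np.exp(log_s)
    x = m + s * eps                             # samples from q = N(m, s^2)
    loss = -norm.logpdf(y, loc=x, scale=meas_std)  # NLL loss; a gamma-loss
    return loss.mean() + kl_gauss(m, s, prior_mean, prior_std)  # would go here

res = minimize(gvi_objective, x0=[0.0, 0.0], method="Nelder-Mead")
m_opt, s_opt = res.x[0], np.exp(res.x[1])
# With the NLL loss and KL divergence this approximates the exact conjugate
# Bayesian posterior, which here has mean 1.6 and std ~ 0.447.
```

Because the loss and divergence are plug-in choices, this same scaffold can be used to prototype and compare many robust/conservative updates before committing to a closed-form realization.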
