Abstract
We investigate the performance of a class of particle filters (PFs) that can automatically tune their computational complexity by evaluating online certain predictive statistics which are invariant for a broad class of state-space models. To be specific, we propose a family of block-adaptive PFs based on the methodology of Elvira et al. (IEEE Trans Signal Process 65(7):1781–1794, 2017). In this class of algorithms, the number of Monte Carlo samples (known as particles) is adjusted periodically, and we prove that the theoretical error bounds of the PF actually adapt to the updates in the number of particles. The evaluation of the predictive statistics that lies at the core of the methodology is done by generating fictitious observations, i.e., particles in the observation space. We study, both analytically and numerically, the impact of the number K of these particles on the performance of the algorithm. In particular, we prove that if the predictive statistics with K fictitious observations converged exactly, then the particle approximation of the filtering distribution would match the first K elements in a series of moments of the true filter. This result can be understood as a converse to some convergence theorems for PFs. From this analysis, we deduce an alternative predictive statistic that can be computed (for some models) without sampling any fictitious observations at all. Finally, we conduct an extensive simulation study that illustrates the theoretical results and provides further insights into the complexity, performance and behavior of the new class of algorithms.
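To make the core mechanism concrete, below is a minimal Python sketch of the kind of predictive statistic the abstract describes: the rank of the actual observation among K fictitious observations generated through the particle approximation. The function names (`predictive_rank_statistic`, `observe`) and the scalar-observation assumption are illustrative choices, not taken from the paper.

```python
import numpy as np

def predictive_rank_statistic(particles, weights, observe, y_t, K, rng):
    """Rank of the actual (scalar) observation y_t among K fictitious
    observations, i.e., particles generated in the observation space.

    Each fictitious observation is drawn by selecting a state particle
    according to its importance weight and pushing it through the
    (stochastic) observation model `observe`. If the particle
    approximation of the predictive distribution were exact, this rank
    would be uniform on {0, 1, ..., K}; departures from uniformity are
    what the adaptation scheme detects. Hedged sketch, not the paper's code.
    """
    idx = rng.choice(len(particles), size=K, p=weights)   # weighted draw of states
    y_fict = np.array([observe(particles[i], rng) for i in idx])
    return int(np.sum(y_fict < y_t))                      # position of y_t among the K
```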
Highlights
We have provided new methodological, theoretical and numerical results on the performance of particle filtering algorithms with an adaptive number of particles
Decisions on whether to increase or decrease the computational effort are automatically made based on predictive statistics which are computed by generating fictitious observations, i.e., particles in the observation space
We present two main theoretical results: (a) the theoretical error bounds of the PF adapt to the online updates in the number of particles; this result, which does not follow from classical convergence theorems for Monte Carlo filters, implies that one can effectively tune the performance of the particle filters (PFs) by adapting the computational effort; (b) convergence of the predictive statistics used for making decisions on the adaptation of the computational effort implies convergence of the PF itself
Summary
There are many problems that are studied by way of dynamic probabilistic models. Some of these models describe mathematically the evolution of hidden states and their relations with observations, which are acquired sequentially. A methodology that has gained considerable popularity in the last two and a half decades is particle filtering (also known as sequential Monte Carlo). This is a Monte Carlo methodology that approximates the distributions of interest by means of random (weighted) samples. A key parameter of particle filters (PFs) is the number of generated Monte Carlo samples (usually termed particles). It is impossible to know a priori the appropriate number of particles needed to achieve a prescribed accuracy.
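As an illustration of how such an adaptive scheme can be organized, here is a hedged Python sketch of a bootstrap PF whose number of particles is revised once per block of W time steps, using the rank statistic sketched above. The doubling/halving rule and the crude uniformity check are placeholders for exposition, not the exact decision rule of the paper, and `init`, `propagate`, `observe`, and `loglik` are assumed user-supplied model functions.

```python
import numpy as np

def block_adaptive_bpf(y, init, propagate, observe, loglik,
                       N0=128, K=7, W=20, Nmin=32, Nmax=8192, seed=0):
    """Bootstrap PF with a block-adaptive number of particles (sketch).

    Assumptions (illustrative, not from the paper): scalar observations,
    `propagate(x, rng)` samples the transition kernel for all particles,
    `loglik(y_t, x)` returns per-particle log-likelihoods as an array.
    """
    rng = np.random.default_rng(seed)
    N = N0
    x = init(N, rng)                        # initial particle set
    ranks = []
    for t, y_t in enumerate(y):
        x = propagate(x, rng)               # draw from the transition kernel
        logw = loglik(y_t, x)               # log weights from the likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # fictitious observations: particles in the observation space
        idx = rng.choice(N, size=K, p=w)
        y_fict = np.array([observe(x[i], rng) for i in idx])
        ranks.append(int(np.sum(y_fict < y_t)))
        x = x[rng.choice(N, size=N, p=w)]   # multinomial resampling
        if (t + 1) % W == 0:                # end of a block: revise N
            # crude check of how far the block's ranks are from uniform
            hist = np.bincount(ranks[-W:], minlength=K + 1) / W
            dev = np.abs(hist - 1.0 / (K + 1)).max()
            if dev > 2.0 / (K + 1) and N < Nmax:
                N = min(2 * N, Nmax)        # far from uniform: refine
            elif dev < 0.5 / (K + 1) and N > Nmin:
                N = max(N // 2, Nmin)       # close to uniform: save work
            if N != len(x):                 # resize the (unweighted) set
                x = x[rng.choice(len(x), size=N)]
    return N, ranks
```

The block structure matters for efficiency: the per-step cost of the statistic is O(K), independent of N, and the particle set is resized only at block boundaries, so the adaptation overhead is negligible relative to the filter itself.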