Abstract
Nonlinear state-space models are powerful tools to describe dynamical structures in complex time series. In a streaming setting where data are processed one sample at a time, simultaneous inference of the state and its nonlinear dynamics has posed significant challenges in practice. We develop a novel online learning framework, leveraging variational inference and sequential Monte Carlo, which enables flexible and accurate Bayesian joint filtering. Our method provides an approximation of the filtering posterior which can be made arbitrarily close to the true filtering distribution for a wide class of dynamics models and observation models. Specifically, the proposed framework can efficiently approximate a posterior over the dynamics using sparse Gaussian processes, allowing for an interpretable model of the latent dynamics. Constant time complexity per sample makes our approach amenable to online learning scenarios and suitable for real-time applications.
Highlights
Nonlinear state-space models are generative models for complex time series with underlying nonlinear dynamical structure [1], [2], [3]
In this study we developed a novel online learning framework, leveraging variational inference and sequential Monte Carlo, which enables flexible and accurate Bayesian joint filtering
Our derivation shows that our filtering posterior can be made arbitrarily close to the true one for a wide class of dynamics models and observation models
Summary
Nonlinear state-space models are generative models for complex time series with underlying nonlinear dynamical structure [1], [2], [3]. We are interested in online algorithms that can recursively solve the dual estimation problem of learning both the latent trajectory, x1:t, in the state space and the parameters of the model, {θ, ψ}, from streaming observations [12]. Existing online approaches usually provide only coarse approximations to the filtering distribution and involve many hyperparameters that must be tuned, which hinders their practical performance. Moreover, they do not take advantage of modern stochastic gradient optimization techniques commonly used throughout machine learning, and they are not applicable to arbitrary observation likelihoods. We propose a novel sequential Monte Carlo method for inferring a state-space model in the streaming time-series setting that adapts the proposal distribution on-the-fly by optimizing a surrogate lower bound on the log normalizer of the filtering distribution.
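The filtering problem the summary refers to can be illustrated with a minimal bootstrap particle filter (the basic sequential Monte Carlo scheme that the proposed method builds on). The sketch below uses a standard toy nonlinear state-space model chosen for illustration; it is not the paper's model, and it uses the fixed dynamics as the proposal rather than the adaptive, variationally optimized proposal the paper develops.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear state-space model (an illustrative choice, not from the paper):
#   x_t = 0.5 * x_{t-1} + 25 * x_{t-1} / (1 + x_{t-1}^2) + process noise
#   y_t = 0.05 * x_t^2 + observation noise
def f(x):
    return 0.5 * x + 25.0 * x / (1.0 + x**2)

def particle_filter(ys, n_particles=500, q=1.0, r=1.0):
    """Bootstrap SMC: propagate particles through the dynamics,
    reweight by the observation likelihood, then resample."""
    x = rng.normal(0.0, 1.0, n_particles)  # initial particle cloud
    means = []
    for y in ys:
        # Propose by simulating the dynamics (bootstrap proposal).
        x = f(x) + rng.normal(0.0, np.sqrt(q), n_particles)
        # Reweight by the Gaussian observation log-likelihood.
        logw = -0.5 * (y - 0.05 * x**2) ** 2 / r
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))  # filtering-mean estimate at this step
        # Multinomial resampling keeps the particle set from degenerating.
        x = x[rng.choice(n_particles, n_particles, p=w)]
    return np.array(means)

# Simulate a short trajectory from the model and filter the observations.
T = 50
x_true, ys = 0.0, np.zeros(T)
for t in range(T):
    x_true = f(x_true) + rng.normal(0.0, 1.0)
    ys[t] = 0.05 * x_true**2 + rng.normal(0.0, 1.0)

est = particle_filter(ys)  # one filtering-mean estimate per time step
```

Each loop iteration costs O(number of particles), independent of t, which is the constant per-sample complexity that makes SMC-based filtering suitable for streaming data; the paper's contribution is to additionally learn the dynamics and adapt the proposal online.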
More From: IEEE transactions on pattern analysis and machine intelligence