Abstract

Variational Bayesian (VB) methods produce posterior inference in considerably less time than traditional Markov chain Monte Carlo approaches. Although the VB posterior is an approximation, it has been shown to produce good parameter estimates and predicted values when a rich class of approximating distributions is considered. In this paper, we propose the use of recursive algorithms to update a sequence of VB posterior approximations in an online, time series setting, with the computation of each posterior update requiring only the data observed since the previous update. We show how importance sampling can be incorporated into online variational inference, allowing the user to trade accuracy for a substantial increase in computational speed. The proposed methods and their properties are detailed in two separate simulation studies. Additionally, two empirical illustrations are provided, including one where a Dirichlet Process Mixture model with a novel posterior dependence structure is repeatedly updated in the context of predicting the future behaviour of vehicles on a stretch of US Highway 101.
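As a rough illustration of how importance sampling can reuse earlier variational output, the sketch below draws samples once from a previously fitted Gaussian approximation and re-weights them to score candidate variational parameters against a new batch of data. This is a minimal toy example under assumed names (`mu_old`, `sd_old`, `log_joint` and `elbo_is` are all illustrative), not the algorithm developed in the paper.

```python
# Minimal sketch (illustrative only): reuse a fixed set of draws from an
# earlier variational fit via self-normalised importance sampling, so that
# many candidate variational parameters can be scored without re-sampling.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Proposal: a previously fitted Gaussian variational approximation q_old.
mu_old, sd_old = 0.0, 1.0
theta = rng.normal(mu_old, sd_old, size=2000)         # drawn once, then reused

def log_joint(theta, y):
    """Toy log p(y, theta): N(y | theta, 1) likelihood with a N(0, 5^2) prior."""
    loglik = stats.norm.logpdf(y[:, None], loc=theta, scale=1.0).sum(axis=0)
    return loglik + stats.norm.logpdf(theta, loc=0.0, scale=5.0)

def elbo_is(mu, sd, theta, y):
    """Importance-sampled ELBO estimate for a candidate q(theta) = N(mu, sd^2)."""
    log_q_new = stats.norm.logpdf(theta, loc=mu, scale=sd)
    log_q_old = stats.norm.logpdf(theta, loc=mu_old, scale=sd_old)
    w = np.exp(log_q_new - log_q_old)                  # importance weights
    w /= w.sum()                                       # self-normalise
    return np.sum(w * (log_joint(theta, y) - log_q_new))

y_new = rng.normal(1.5, 1.0, size=50)                  # newly observed batch
# The same draws can now score many candidate (mu, sd) pairs cheaply.
print(elbo_is(1.0, 0.5, theta, y_new), elbo_is(1.4, 0.2, theta, y_new))
```

The speed gain comes from drawing `theta` only once; the accuracy cost is that the importance weights degrade as the candidate distribution moves away from the proposal, which is the trade-off the abstract refers to.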

Highlights

  • Time series data often arrives in high-frequency streams in applications that may require a response within a very short period of time

  • This paper proposes a framework to extend the use of Stochastic Variational Bayes (SVB) inference to a sequential posterior updating setting

  • Updating Variational Bayes (UVB) is a variational analogue of exact Bayesian updating: the previous posterior distribution, which plays the role of the prior for the next update, is replaced by an approximation derived from an earlier SVB approximation (see the sketch below)
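To make the last highlight concrete, the display below sketches the updating identity in generic notation (the symbols T1, T2, q and Q are illustrative choices, not necessarily the paper's): the exact update treats the time-T1 posterior as the prior for the new observations, and the variational version substitutes the earlier approximation for that intractable posterior.

```latex
% Generic sketch of exact vs. UVB-style updating; notation is illustrative, not the paper's.
\[
p(\theta \mid y_{1:T_2}) \propto p(y_{T_1+1:T_2} \mid \theta, y_{1:T_1})\, p(\theta \mid y_{1:T_1}),
\qquad
q_{T_2}(\theta) = \arg\min_{q \in \mathcal{Q}}
  \mathrm{KL}\big( q(\theta) \,\|\, \tilde{p}(\theta \mid y_{1:T_2}) \big),
\]
\[
\text{where } \tilde{p}(\theta \mid y_{1:T_2}) \propto p(y_{T_1+1:T_2} \mid \theta, y_{1:T_1})\, q_{T_1}(\theta)
\text{ replaces the exact } p(\theta \mid y_{1:T_1}) \text{ with the earlier approximation } q_{T_1}(\theta).
\]
```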


Summary

Introduction

Time series data often arrives in high-frequency streams in applications that may require a response within a very short period of time. Any data observed from the data generating process may be used within the MFVB coordinate descent algorithm, which is applied online with newly observed data substituted in as it becomes available. Each of these approaches results in only a single posterior distribution conditioned on data up to some pre-specified time period T1, and does not provide a mechanism for the approximation to be updated at a later time period T2 once additional observations become available. Smidl (2004) and Broderick et al. (2013) each consider VB approximations for Bayesian updating, resulting in a progressive sequence of approximate posterior distributions, each conditioning on data up to a given time period Tn. Their approaches update to time Tn+1 by substituting the time Tn posterior with an MFVB approximation, which is feasible because the model and the approximation are each assumed to adhere to a suitably defined exponential family form. UVB is applied to a vehicle DPM model in Section 7, and Section 8 concludes the paper.
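The sequential structure described above can be mimicked in a toy conjugate setting, where the update of the approximation is available in closed form. The sketch below only illustrates that structure (each update sees just the newest batch, with the previous approximation acting as the prior); in the paper's setting the closed-form step would be replaced by an SVB or UVB optimisation, and the function and variable names here are assumptions.

```python
# Illustrative sketch of sequential posterior updating (not the paper's code):
# a Gaussian approximation to the mean of a normal model is refreshed batch by
# batch, with the previous approximation playing the role of the prior.
import numpy as np

rng = np.random.default_rng(1)

def update_gaussian_approx(prior_mean, prior_var, y_batch, noise_var=1.0):
    """Closed-form update of a N(prior_mean, prior_var) approximation given a
    batch assumed to be drawn from N(theta, noise_var)."""
    n = len(y_batch)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + y_batch.sum() / noise_var)
    return post_mean, post_var

mean, var = 0.0, 10.0                        # initial fit (e.g. SVB at time T1)
for t in range(5):                           # batches arriving at T2, T3, ...
    batch = rng.normal(2.0, 1.0, size=20)    # only the newest observations are used
    mean, var = update_gaussian_approx(mean, var, batch)
    print(f"update {t + 1}: mean = {mean:.3f}, variance = {var:.4f}")
```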

Background on Variational Bayes
Updating Variational Bayes
UVB with Importance Sampling
Simulation Study
Time Series Forecasting
Mixture Model Clustering
Eight Schools Example
Lane Position Example
A Hierarchical Model
Implementation of SVB at time T1
Iterating UVB
Predicting Lane Positions
Result
Analysis of the NGSIM Data
Conclusions
A Calculation of Lateral Lane Deviation
B Equivalence of augmented and marginal KL Divergence gradients
C Mean Field Variational Bayes implementation of the Dirichlet Process Mixture
