Abstract

Variational Bayesian inference with a Gaussian posterior approximation provides an alternative to the more commonly employed factorization approach and enlarges the range of tractable distributions. In this paper, we propose an extension of the Gaussian approach which uses Gaussian mixtures as approximations. A general problem for variational inference with mixtures is posed by the calculation of the entropy term in the Kullback-Leibler distance, which becomes analytically intractable. We deal with this problem by using a simple lower bound for the entropy and imposing restrictions on the form of the component covariance matrices. In this way, efficient numerical calculations become possible. To illustrate the method, we discuss its application to an isotropic generalized normal target density, a non-Gaussian state-space model, and the Bayesian lasso. For heavy-tailed distributions, the examples show that the mixture approach indeed leads to improved approximations in the sense of a reduced Kullback-Leibler distance. From a more practical point of view, mixtures can improve estimates of posterior marginal variances. Furthermore, they provide a first estimate of posterior skewness, which is not possible with single Gaussians. We also discuss general sufficient conditions under which mixtures are guaranteed to improve on single-component approximations.
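For context, the entropy bound in question can be illustrated with a standard Jensen-type inequality (our notation; the paper's exact form is not reproduced here). For a Gaussian mixture q(x) = Σ_k π_k N(x; μ_k, Σ_k), applying Jensen's inequality componentwise yields a closed-form lower bound:

```latex
% Jensen's inequality, \int N_j \log q \le \log \int N_j q, applied to each
% component turns the intractable mixture entropy into a closed-form bound,
% since \int N(x;\mu_j,\Sigma_j)\,N(x;\mu_k,\Sigma_k)\,dx
%     = N(\mu_j;\,\mu_k,\,\Sigma_j+\Sigma_k).
H[q] \;=\; -\int q(x)\,\log q(x)\,dx
\;\ge\; -\sum_{j=1}^{K} \pi_j \,\log \sum_{k=1}^{K} \pi_k\,
\mathcal{N}\!\left(\mu_j;\,\mu_k,\;\Sigma_j + \Sigma_k\right).
```

Every term on the right-hand side is an ordinary Gaussian density evaluation, so the bound and its gradients are exactly computable; restricting the component covariances then keeps the number of free parameters manageable.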

Highlights

  • The Variational Bayes (VB) method has attracted growing interest as an alternative to Monte-Carlo integration in computational Bayesian inference (for reviews from the perspectives of different fields see, e.g., Opper and Saad (2001), Smidl and Quinn (2005), Bishop (2006), Wainwright and Jordan (2008), Ormerod and Wand (2010))

  • We focus on the VB variances, which might be considered the most interesting quantities in this context from the point of view of Bayesian inference

  • The purpose of the present paper is to study the use of Gaussian mixture distributions as trial functions; the underlying variational problem is sketched just after this list
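For readers new to the setup, the variational problem behind these highlights can be stated compactly (our notation, a standard formulation rather than a quotation from the paper): choose the trial density q to minimize the Kullback-Leibler distance to the posterior, with q ranging over Gaussian mixtures.

```latex
% VB with a Gaussian-mixture trial function (standard formulation):
\min_{q \in \mathcal{Q}} \; \mathrm{KL}\!\left(q \,\|\, p\right)
  \;=\; \int q(\theta)\, \log \frac{q(\theta)}{p(\theta \mid y)}\; d\theta,
\qquad
q(\theta) \;=\; \sum_{k=1}^{K} \pi_k\, \mathcal{N}(\theta;\, \mu_k, \Sigma_k).
```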


Summary

Introduction

The Variational Bayes (VB) method has attracted growing interest as an alternative to Monte-Carlo integration in computational Bayesian inference (for reviews from the perspectives of different fields see, e.g., Opper and Saad (2001), Smidl and Quinn (2005), Bishop (2006), Wainwright and Jordan (2008), Ormerod and Wand (2010)). This approach is mainly applied to “conjugate-exponential” Bayesian target distributions p, for which the data likelihood is in the exponential family and the priors are chosen as conjugate. In this case, the optimized VB factors turn out to belong to the same classes of distributions as the priors, and the Kullback-Leibler minimization problem can be solved efficiently by means of an iteration scheme that updates a finite vector of parameters. For mixture trial functions, the entropy term becomes analytically intractable; its computation is dealt with by making use of a lower-bound approximation and by restricting the choice of covariance matrices for the mixture components. In this way, efficient numerical calculations become feasible.
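As a concrete illustration of these ingredients, the following Python sketch (our own, not the paper's implementation) fits a two-component mixture to a hypothetical heavy-tailed Student-t target. It uses equal weights and a common fixed component variance as a stand-in for the restricted covariance choice, together with the Jensen entropy lower bound shown after the abstract:

```python
# Minimal sketch (not the paper's implementation): fit a 2-component
# Gaussian-mixture variational approximation to a heavy-tailed 1-D target,
# with equal weights and a common fixed component variance (a "restricted
# covariance" choice) and the Jensen entropy lower bound shown earlier.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def log_p(x, df=3.0):
    """Unnormalized log-density of a Student-t target (hypothetical example)."""
    return -0.5 * (df + 1.0) * np.log1p(x ** 2 / df)

def entropy_bound(w, mu, s2):
    """H[q] >= -sum_j w_j log sum_k w_k N(mu_j; mu_k, 2*s2) for equal variances."""
    z = norm.pdf(mu[:, None], loc=mu[None, :], scale=np.sqrt(2.0 * s2))
    return -np.sum(w * np.log(z @ w))

def neg_elbo(mu, w, s2, n_mc=4000):
    """-(E_q[log p] + entropy bound); E_q[log p] by fixed-seed Monte Carlo."""
    eps = np.random.default_rng(0).standard_normal(n_mc)  # common random numbers
    e_logp = sum(wj * np.mean(log_p(mj + np.sqrt(s2) * eps))
                 for wj, mj in zip(w, mu))
    return -(e_logp + entropy_bound(w, mu, s2))

w, s2 = np.array([0.5, 0.5]), 1.0
res = minimize(neg_elbo, x0=np.array([-1.0, 1.0]), args=(w, s2))
print("fitted component means:", res.x)
```

Because the Monte-Carlo estimate of E_q[log p] uses a fixed seed, the objective is deterministic and a standard quasi-Newton optimizer can be applied directly; the paper's actual scheme instead iterates analytic parameter updates in the conjugate-exponential setting.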

Further sections

  • Variational approximations with mixtures
  • Entropy of a mixture distribution
  • General approach
  • Example
  • Sufficient conditions for improvement of VB approximations by mixtures
  • A non-Gaussian state-space model
  • The Bayesian lasso
  • Findings
  • Summary and conclusions