Abstract

We develop a fast and accurate approach to approximating posterior distributions in the Bayesian empirical likelihood framework. Bayesian empirical likelihood permits Bayesian shrinkage without specifying a full likelihood, but it is known to pose several computational difficulties. By coupling the stochastic variational Bayes procedure with an adjusted empirical likelihood framework, the proposed method overcomes the intractability of both the exact posterior and the resulting evidence lower bound objective, as well as the mismatch between the supports of the exact and variational posteriors. The optimization algorithm achieves fast convergence by using the variational expected gradient of the log adjusted empirical likelihood function. We prove the consistency of the proposed approximate posterior distribution and an empirical likelihood analogue of the variational Bernstein-von Mises theorem. Several numerical examples confirm the accuracy and fast algorithmic convergence of the proposed method. Supplementary materials for this article are available online.
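As a schematic sketch in our own notation (a variational family q_\lambda with parameter \lambda, prior \pi, and adjusted empirical likelihood L_{\mathrm{AEL}}; these symbols are illustrative and need not match the paper's definitions), the evidence lower bound being optimized takes the standard variational-Bayes form

\[
\mathcal{L}(\lambda) \;=\; \mathbb{E}_{q_\lambda}\!\big[\log L_{\mathrm{AEL}}(\theta)\big] \;-\; \mathrm{KL}\big(q_\lambda(\theta)\,\big\|\,\pi(\theta)\big),
\]

and, as described above, the stochastic optimizer updates \lambda using Monte Carlo estimates of the variational expected gradient \(\mathbb{E}_{q_\lambda}\!\big[\nabla_\theta \log L_{\mathrm{AEL}}(\theta)\big]\).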
