Abstract

In this article, we introduce the Kullback-Leibler (K-L) divergence as a performance measure for marginal posterior density estimation. We show that the K-L divergence can be used both to compare two density estimators and to assess the convergence of a marginal density estimator. We also examine the performance of the importance-weighted marginal density estimation (IWMDE) proposed by Chen (1994) under the K-L divergence, and we further extend the IWMDE to more complex Bayesian models in which the kernel method, widely used for estimating marginal densities from Markov chain Monte Carlo (MCMC) sampling outputs, is not applicable. Finally, we use a constrained linear multiple regression model as an example to illustrate our methodology.
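
For reference, the K-L divergence between a true marginal posterior density and an estimator of it takes the standard form below; the notation is illustrative and not necessarily the paper's own. Writing \pi(\theta_1 \mid D) for the true marginal posterior of a parameter block \theta_1 given data D, and \hat{\pi}(\theta_1 \mid D) for an estimator of it,

D\bigl(\pi, \hat{\pi}\bigr) = \int \pi(\theta_1 \mid D) \, \log \frac{\pi(\theta_1 \mid D)}{\hat{\pi}(\theta_1 \mid D)} \, d\theta_1 ,

which is nonnegative and equals zero only when the estimator matches the true marginal almost everywhere, so smaller values indicate a better density estimate.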
