Abstract
A common method for assessing the validity of Bayesian sampling or approximate inference methods makes use of simulated data replicates for parameters drawn from the prior. Under continuity assumptions, quantiles of functions of the simulated parameter values, evaluated under the corresponding posterior distributions, are uniformly distributed. Checking for uniformity when a posterior density is approximated numerically provides a diagnostic for algorithm validity. Furthermore, adjustments to achieve uniformity can improve the quality of approximate inference methods. The present article develops a moment-based alternative to the conventional checking and adjustment methods using quantiles. The new approach relates prior and posterior expectations and covariances through the tower property of conditional expectation and the law of total variance. For adjustment, approximate inferences are modified so that the correct prior-to-posterior relationships hold. We illustrate the method in three examples. The first uses an auxiliary model in a likelihood-free inference problem. The second considers corrections for variational Bayes approximations in a deep neural network generalized linear mixed model. Our final application considers a deep neural network surrogate for approximating Gaussian process regression predictive inference. Supplementary files for this article are available online.
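As a concrete illustration of the moment relations the abstract refers to, the tower property gives E[θ] = E{E(θ | y)} and the law of total variance gives Var(θ) = E{Var(θ | y)} + Var{E(θ | y)}, with the outer expectations taken over data y drawn from the prior predictive distribution. The code below is a minimal sketch of the resulting check, using a hypothetical Gaussian conjugate toy model in which the exact posterior stands in for the approximate inference under assessment; the model choice, variable names, and replication count are illustrative assumptions rather than details taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (our assumption, not the paper's example):
# theta ~ N(mu0, tau0^2), y | theta ~ N(theta, sigma^2).
mu0, tau0, sigma, n_rep = 0.0, 1.0, 0.5, 5000

post_means, post_vars = [], []
for _ in range(n_rep):
    theta = rng.normal(mu0, tau0)   # draw a parameter from the prior
    y = rng.normal(theta, sigma)    # simulate a data replicate given that parameter
    # Exact conjugate posterior moments; in practice these would be replaced by the
    # output of the sampling or approximate inference method being checked.
    post_var = 1.0 / (1.0 / tau0**2 + 1.0 / sigma**2)
    post_mean = post_var * (mu0 / tau0**2 + y / sigma**2)
    post_means.append(post_mean)
    post_vars.append(post_var)

post_means = np.array(post_means)
post_vars = np.array(post_vars)

# Tower property: the average of the posterior means should match the prior mean.
print("mean of posterior means:", post_means.mean(), "prior mean:", mu0)
# Law of total variance: E[posterior variance] + Var[posterior mean] should match
# the prior variance.
print("total variance:", post_vars.mean() + post_means.var(), "prior variance:", tau0**2)
```

In this sketch, systematic departures of the empirical moments from the prior moments would signal a problem with the inference algorithm, and the same relations suggest how approximate inferences could be adjusted so that the correct prior-to-posterior relationships hold, in the spirit of the adjustment described in the abstract.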