Abstract

This paper establishes the asymptotic consistency of the loss-calibrated variational Bayes (LCVB) method. LCVB computes approximate Bayesian posteriors in a “loss-aware” manner, which makes it highly relevant in general data-driven decision-making contexts. We establish the asymptotic consistency of both the loss-calibrated approximate posterior and the resulting decision rules. We also establish the asymptotic consistency of decision rules obtained from a “naive” two-stage procedure that first computes a standard variational Bayes approximation and then uses it in the decision-making problem.
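As a schematic illustration (the notation below is introduced here for exposition and is not taken from the paper's own statement): given data $X_n$, a prior $\pi(\theta)$, a variational family $\mathcal{Q}$, an action set $\mathcal{A}$, and a utility $u(a,\theta)$ (for instance, a positive transform of a negative loss), the naive two-stage procedure first solves the standard variational Bayes problem and then optimizes the decision against the fitted posterior,
\[
q^{*} \in \operatorname*{arg\,max}_{q \in \mathcal{Q}}
\Big\{ \mathbb{E}_{q}\big[\log p(X_n \mid \theta)\big] - \mathrm{KL}\big(q(\theta)\,\|\,\pi(\theta)\big) \Big\},
\qquad
a^{*}_{\mathrm{VB}} \in \operatorname*{arg\,max}_{a \in \mathcal{A}} \, \mathbb{E}_{q^{*}}\big[u(a,\theta)\big],
\]
whereas a loss-calibrated formulation in the spirit of LCVB couples the two steps, maximizing jointly in $(q,a)$ a Jensen-type lower bound on the log posterior-expected utility,
\[
\big(q^{*}_{\mathrm{LC}}, a^{*}_{\mathrm{LC}}\big) \in
\operatorname*{arg\,max}_{q \in \mathcal{Q},\, a \in \mathcal{A}}
\Big\{ \mathbb{E}_{q}\big[\log u(a,\theta)\big]
      + \mathbb{E}_{q}\big[\log p(X_n \mid \theta)\big]
      - \mathrm{KL}\big(q(\theta)\,\|\,\pi(\theta)\big) \Big\},
\]
so that the approximate posterior is tilted toward regions of the parameter space that matter for the downstream decision.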
