Abstract

The Bayesian KL-optimality criterion is useful for discriminating between any two statistical models in the presence of prior information. If the rival models are not nested then, depending on which model is true, two different Kullback–Leibler distances may be defined. The Bayesian KL-optimality criterion is a convex combination of the expected values of these two possible Kullback–Leibler distances between the competing models. The expectations are taken over the prior distributions of the parameters, and the weights of the convex combination are the prior probabilities of the models. Concavity of the Bayesian KL-optimality criterion is proved, so that classical results of Optimal Design Theory can be applied. A standardized version of the proposed criterion is also given in order to account for possibly different magnitudes of the two Kullback–Leibler distances. Some illustrative examples are provided.
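To make the construction concrete, the following is a minimal Monte Carlo sketch of evaluating such a criterion for a fixed design. Everything specific here is an assumption for illustration, not taken from the paper: the two non-nested mean functions `eta1` and `eta2`, homoscedastic Gaussian errors (under which the KL distance reduces to a weighted sum of squared mean differences), uniform priors on the parameters, equal prior model probabilities, and a crude grid search for the nearest member of the rival family.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative rival (non-nested) mean functions; these particular forms are
# assumptions for this sketch, not the models used in the paper.
def eta1(x, theta):
    return theta * np.exp(-x)          # exponential-decay model

def eta2(x, beta):
    return beta * x / (1.0 + x)        # saturation-type model

# Under homoscedastic Gaussian errors with variance sigma2, the KL distance
# between the two models at a design xi = {(x_i, w_i)} reduces to a
# weighted sum of squared differences of the mean functions.
def kl_distance(ws, mu_a, mu_b, sigma2=1.0):
    return np.sum(ws * (mu_a - mu_b) ** 2) / (2.0 * sigma2)

# KL distance from the "true" model to the closest member of the rival
# family, approximated here by a simple grid search over the rival parameter.
def min_kl(xs, ws, mu_true, eta_rival, grid):
    return min(kl_distance(ws, mu_true, eta_rival(xs, b)) for b in grid)

def bayesian_kl_criterion(xs, ws, p1=0.5, n_mc=400):
    grid = np.linspace(0.1, 5.0, 200)
    # Expected KL distance when model 1 is true (uniform prior assumed)
    i21 = np.mean([min_kl(xs, ws, eta1(xs, t), eta2, grid)
                   for t in rng.uniform(0.5, 2.0, n_mc)])
    # Expected KL distance when model 2 is true (uniform prior assumed)
    i12 = np.mean([min_kl(xs, ws, eta2(xs, b), eta1, grid)
                   for b in rng.uniform(0.5, 2.0, n_mc)])
    # Convex combination weighted by the prior model probabilities
    return p1 * i21 + (1.0 - p1) * i12

# Compare two candidate three-point designs with equal weights.
xs_a, ws_a = np.array([0.5, 1.5, 3.0]), np.full(3, 1 / 3)
xs_b, ws_b = np.array([0.1, 0.2, 0.3]), np.full(3, 1 / 3)
print(bayesian_kl_criterion(xs_a, ws_a), bayesian_kl_criterion(xs_b, ws_b))
```

A larger criterion value indicates a design at which the two model families are easier to tell apart on average; a KL-optimal design would maximize this quantity over all designs, which is where the concavity result becomes useful.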

