Abstract

The deviance information criterion (DIC) was introduced in 2002 by Spiegelhalter et al. to compare the relative fit of a set of Bayesian hierarchical models. It is similar to Akaike's information criterion (AIC) in combining a measure of goodness-of-fit with a measure of complexity, both based on the deviance. While AIC uses the maximum likelihood estimate, DIC's plug-in estimate is based on the posterior mean. Because the number of independent parameters in a Bayesian hierarchical model is not clearly defined, DIC estimates the effective number of parameters as the difference between the posterior mean of the deviance and the deviance at the posterior mean. This coincides with the number of independent parameters in fixed-effect models with flat priors, so DIC is a generalization of AIC. It can be justified as an estimate of posterior predictive model performance within a decision-theoretic framework, and it is asymptotically equivalent to leave-one-out cross-validation. DIC has been used extensively for practical model comparison in many disciplines and works well for exponential-family models, but, owing to its dependence on the parametrization and focus of a model, its application to mixture models is problematic.
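The computation described above can be sketched directly from posterior draws: writing the deviance as D(θ) = −2 log L(θ), the effective number of parameters is pD = mean(D) − D(θ̄), and DIC = D(θ̄) + 2·pD. A minimal illustration, using a hypothetical conjugate normal-mean model (the `dic` helper and all names below are for illustration only, not from the original paper):

```python
import numpy as np

def dic(posterior_samples, log_lik):
    """Estimate DIC and p_D from posterior draws.

    posterior_samples: array of shape (n_draws, n_params)
    log_lik: function mapping a parameter vector to the total log-likelihood
    """
    # Deviance D(theta) = -2 * log-likelihood, evaluated at each draw
    deviances = np.array([-2.0 * log_lik(th) for th in posterior_samples])
    d_bar = deviances.mean()                                      # posterior mean deviance
    d_at_mean = -2.0 * log_lik(posterior_samples.mean(axis=0))    # deviance at posterior mean
    p_d = d_bar - d_at_mean            # effective number of parameters
    return d_at_mean + 2.0 * p_d, p_d  # DIC = D(theta_bar) + 2 p_D

# Toy example: unknown normal mean, known unit variance, N(0, 10^2) prior
rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=50)
post_var = 1.0 / (1.0 / 10**2 + len(y))        # conjugate posterior variance
post_mean = post_var * y.sum()                 # conjugate posterior mean
draws = rng.normal(post_mean, np.sqrt(post_var), size=(4000, 1))

def log_lik(theta):
    mu = theta[0]
    return -0.5 * np.sum((y - mu) ** 2) - 0.5 * len(y) * np.log(2 * np.pi)

dic_value, p_d = dic(draws, log_lik)
# With one free parameter and a nearly flat prior, p_d should be close to 1
```

For this one-parameter model with a diffuse prior, the estimated pD is close to the actual number of independent parameters, illustrating the sense in which DIC generalizes AIC.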
