Abstract

Bayesian multimodel inference (BMI) has a strong philosophical appeal; like Bayesian inference generally, it retains the features of simplicity, exactness, and coherency. BMI is a natural extension of the basic Bayesian technique: one makes inference about unknown quantities (in this case, models) based on their posterior distributions, given data. Posterior model probabilities are used to combine model-specific estimates and inferences. As for model selection, if a single model is desired, posterior model probabilities provide an objective basis for choice. This chapter provides an overview of BMI, with comments on model weights, Bayes factors, the Bayesian information criterion (BIC), and the deviance information criterion (DIC). It also discusses computational issues, describing reversible jump Markov chain Monte Carlo (RJMCMC) and simple implementations of BMI in program BUGS. The Bayes factor provides a way of comparing pairs of competing models. The models need not be nested; neither need be a special case of the other. Notable features of Bayes factors are that they are a likelihood ratio for models, they update naturally, they are measures of relative support, and they often use vague priors on parameters.
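The quantities discussed above can be illustrated with a minimal sketch. Using the standard BIC approximation to the marginal likelihood, exp(-BIC/2), one can convert model-specific BIC values into approximate posterior model probabilities (assuming equal prior model probabilities) and an approximate Bayes factor. The log-likelihoods, parameter counts, and sample size below are hypothetical values chosen only for illustration, not from the chapter.

```python
import math

def bic(log_lik, k, n):
    """Bayesian information criterion: k * ln(n) - 2 * ln(L)."""
    return k * math.log(n) - 2.0 * log_lik

def posterior_model_probs(bics, priors=None):
    """Approximate posterior model probabilities from BIC values,
    using the exp(-BIC/2) approximation to the marginal likelihood
    and, by default, equal prior model probabilities."""
    m = len(bics)
    if priors is None:
        priors = [1.0 / m] * m
    best = min(bics)
    # Subtract the smallest BIC before exponentiating for numerical stability;
    # the shift cancels in the normalization.
    weights = [p * math.exp(-(b - best) / 2.0) for b, p in zip(bics, priors)]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical example: two non-nested models fit to n = 100 observations.
b1 = bic(log_lik=-120.0, k=3, n=100)  # simpler model
b2 = bic(log_lik=-118.5, k=5, n=100)  # richer model, slightly better fit
probs = posterior_model_probs([b1, b2])

# Approximate Bayes factor for model 1 vs. model 2: exp(-(BIC1 - BIC2)/2).
bf_12 = math.exp(-(b1 - b2) / 2.0)
```

Under these illustrative inputs, the extra parameters of model 2 are not justified by its modest gain in fit, so the approximation favors model 1. The resulting probabilities could then weight model-specific estimates for model-averaged inference.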
