Abstract

This chapter reviews statistical models and their analysis in WinBUGS, contrasting the two main modes of statistical inference: one is maximum likelihood and the other is Bayesian inference. Bayesian inference is based on the posterior distribution, which is proportional to the product of the likelihood (representing the information contained in the data) and the prior distribution (representing what is known about the parameters beforehand). Bayesian inference uses a basic fact of conditional probability, Bayes' rule, to let the data update the prior state of knowledge to the posterior state of knowledge. Priors can be regarded both as an asset and as a liability in Bayesian inference. The results of a Bayesian analysis based on the posterior distribution are much more easily explained to the public owing to the more intuitive Bayesian definition of probability. In practice, Bayesian analysis nowadays means obtaining samples from the posterior distribution by simulation techniques such as Markov chain Monte Carlo.
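The idea sketched above can be made concrete with a minimal Metropolis sampler, the simplest Markov chain Monte Carlo algorithm. The example below is a hedged illustration, not anything from the chapter itself: all numbers (a Normal(0, 10) prior on a mean mu, normal data with known unit standard deviation, the proposal step size) are assumptions chosen for clarity. It shows how the posterior, defined only up to proportionality as likelihood times prior, is enough to draw samples.

```python
import random
import math

# Illustrative data: 50 draws from Normal(mean=2, sd=1); seed fixed for repeatability
random.seed(1)
data = [random.gauss(2.0, 1.0) for _ in range(50)]

def log_prior(mu):
    # Assumed Normal(0, 10) prior on mu (log density up to an additive constant)
    return -mu**2 / (2 * 10**2)

def log_likelihood(mu):
    # Normal(mu, 1) likelihood for the data (log density up to an additive constant)
    return -sum((x - mu)**2 for x in data) / 2

def log_posterior(mu):
    # Bayes' rule: posterior is proportional to likelihood times prior,
    # so on the log scale the two terms simply add
    return log_likelihood(mu) + log_prior(mu)

def metropolis(n_iter=5000, start=0.0, step=0.5):
    """Random-walk Metropolis: propose a move, accept with probability
    min(1, posterior ratio); the normalizing constant cancels in the ratio."""
    mu, samples = start, []
    for _ in range(n_iter):
        proposal = mu + random.gauss(0, step)
        if math.log(random.random()) < log_posterior(proposal) - log_posterior(mu):
            mu = proposal  # accept the proposed value
        samples.append(mu)   # otherwise keep the current value
    return samples

samples = metropolis()
burned = samples[1000:]  # discard burn-in before summarizing
post_mean = sum(burned) / len(burned)
print(round(post_mean, 2))
```

With a prior this vague, the posterior mean of the retained samples lands close to the sample mean of the data, which is the maximum likelihood estimate here; this is the usual way the two modes of inference agree when the prior carries little information.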
