Abstract

We provide a largely nonmathematical introduction to the use of probability in statistical modeling and in parametric statistical inference by the two dominant estimation methods, maximum likelihood and Bayesian posterior inference. We explain and illustrate the simulation-based methods collectively known as Markov chain Monte Carlo (MCMC), which is how Bayesian inference is nearly always conducted nowadays. We introduce a family of modeling software, all of which use virtually the same model-definition language (known as BUGS) and then fit models using MCMC techniques, thereby providing Bayesian posterior inference for arbitrarily complex statistical models. One of these programs is JAGS, the software of choice in this book. In this chapter, we introduce the use of JAGS for fitting a range of generalized linear and generalized linear mixed models. Throughout the chapter, we use simulated data to emphasize the great value of data simulation for understanding statistical methods and models. In addition, we fit most models with frequentist methods and compare the resulting maximum likelihood estimates with the Bayesian posterior estimates. We observe that, with vague priors, Bayesian estimates are typically numerically similar to the maximum likelihood estimates. We also give an overview of the principles of data integration in a general integrated model that defines a joint likelihood for multiple disparate data sets. We illustrate these general principles with an integrated species distribution model that combines detection–nondetection and count data. We discuss whether one should be exclusively Bayesian or exclusively frequentist, and ultimately argue for an eclectic choice. We believe that both maximum likelihood estimation and Bayesian posterior inference have their place in applied statistics.
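
As a minimal sketch of the workflow described above (not the chapter's own worked example), the following R code simulates data for a simple Poisson GLM, obtains maximum likelihood estimates with glm(), and then fits the same model in JAGS via the BUGS language with vague priors, so that posterior means can be compared with the MLEs. It assumes JAGS is installed and uses the jagsUI package as one possible R interface; the model file name and true parameter values are illustrative choices only.

# Simulate Poisson-GLM data, fit by maximum likelihood, then by MCMC in JAGS
library(jagsUI)

set.seed(1)
n <- 200
x <- rnorm(n)                    # covariate
lambda <- exp(0.5 + 1.2 * x)     # true intercept 0.5, true slope 1.2
y <- rpois(n, lambda)            # simulated counts

# Maximum likelihood estimates (frequentist fit)
fm <- glm(y ~ x, family = poisson)
print(coef(fm))

# BUGS-language model definition, written to a text file for JAGS
cat("
model {
  alpha ~ dnorm(0, 0.001)        # vague priors (precision parameterization)
  beta  ~ dnorm(0, 0.001)
  for (i in 1:n) {
    log(lambda[i]) <- alpha + beta * x[i]
    y[i] ~ dpois(lambda[i])
  }
}
", file = "poisson_glm.txt")

# Bayesian posterior inference via MCMC in JAGS
out <- jags(data = list(y = y, x = x, n = n),
            parameters.to.save = c("alpha", "beta"),
            model.file = "poisson_glm.txt",
            n.chains = 3, n.adapt = 1000,
            n.iter = 6000, n.burnin = 1000, n.thin = 1)

# Posterior means and SDs; with these vague priors they should be
# numerically close to the maximum likelihood estimates above
print(out$summary[c("alpha", "beta"), c("mean", "sd")])

With a vague normal prior on each coefficient and a moderate sample size, the posterior means from the JAGS fit will typically agree with the glm() estimates to roughly the second decimal place, illustrating the correspondence between the two inference paradigms noted in the abstract.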
