Abstract

Parallel computation is a fast-growing computing paradigm in many areas, including computational Bayesian statistics. However, most Bayesian parallel computing has been implemented through sequential Monte Carlo methods, in which model parameters are updated sequentially and which are suited to certain large-scale problems. This paper is the first to revive the adaptive griddy Gibbs (AGG) algorithm within the Markov chain Monte Carlo framework and to show how to implement AGG using parallel computation. The parallel AGG is suitable when (i) the problem is of small to medium scale, so that the dimension of the model parameter space is not very high, (ii) some or all model parameters are defined or known on a specific interval, and (iii) the model likelihood is intractable. In addition, the parallel AGG is relatively easy to implement and code. A simulation study of three examples, namely a linear regression model with Student-t errors, a nonlinear regression model, and a financial time series model, together with an empirical study, illustrates the applicability of AGG in a parallel computing environment.
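To make the core idea concrete, the following is a minimal sketch of a single griddy Gibbs update, not the authors' implementation: the full conditional density of one parameter is evaluated on a grid of points, normalized, and a draw is obtained by inverting the resulting discrete CDF. The function names and the Gaussian example are illustrative assumptions; the grid evaluations are mutually independent, which is what makes the method straightforward to parallelize across grid points.

```python
import numpy as np

def griddy_gibbs_update(log_cond_density, grid, rng):
    """One griddy Gibbs draw for a single scalar parameter.

    Evaluates the (unnormalized) log conditional density on a grid,
    normalizes the resulting probabilities, and samples by inverting
    the discrete CDF.  In a parallel implementation, the loop over
    grid points would be distributed across workers, since each
    density evaluation is independent of the others.
    """
    # Independent density evaluations (the parallelizable step).
    logp = np.array([log_cond_density(x) for x in grid])
    # Subtract the max before exponentiating for numerical stability.
    p = np.exp(logp - logp.max())
    p /= p.sum()
    # Invert the discrete CDF with a uniform draw.
    cdf = np.cumsum(p)
    u = rng.uniform()
    return grid[np.searchsorted(cdf, u)]

# Illustrative usage: draws from a standard normal conditional.
rng = np.random.default_rng(0)
grid = np.linspace(-5.0, 5.0, 201)
draws = np.array([griddy_gibbs_update(lambda x: -0.5 * x * x, grid, rng)
                  for _ in range(2000)])
```

In an adaptive variant, the grid itself would be refined between sweeps based on where the conditional mass concentrates; that refinement step is omitted here for brevity.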
