Abstract

A crucial problem in Bayesian posterior computation is efficient sampling from a univariate distribution, e.g. a full conditional distribution in applications of the Gibbs sampler. This full conditional distribution is usually non-conjugate, algebraically complex and computationally expensive to evaluate. We propose ARMS2, an alternative to the widely used adaptive rejection sampling (ARS) technique [Gilks, W.R., Wild, P., 1992. Adaptive rejection sampling for Gibbs sampling. Applied Statistics 41 (2), 337–348; Gilks, W.R., 1992. Derivative-free adaptive rejection sampling for Gibbs sampling. In: Bernardo, J.M., Berger, J.O., Dawid, A.P., Smith, A.F.M. (Eds.), Bayesian Statistics, Vol. 4. Clarendon, Oxford, pp. 641–649], for generating samples from univariate log-concave densities. Whereas ARS is based on sampling from piecewise exponential envelopes, the new algorithm uses truncated normal distributions and makes use of a clever auxiliary variable technique [Damien, P., Walker, S.G., 2001. Sampling truncated normal, beta, and gamma densities. Journal of Computational and Graphical Statistics 10 (2), 206–215]. Furthermore, we extend this algorithm to deal with non-log-concave densities, providing an enhanced alternative to adaptive rejection Metropolis sampling, ARMS [Gilks, W.R., Best, N.G., Tan, K.K.C., 1995. Adaptive rejection Metropolis sampling within Gibbs sampling. Applied Statistics 44, 455–472]. The performance of ARMS and ARMS2 is compared in simulations of standard univariate distributions as well as in Gibbs sampling of a Bayesian hierarchical state-space model used for fisheries stock assessment.
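To give a flavour of the auxiliary variable idea of Damien and Walker (2001) that the abstract refers to, the sketch below shows a minimal Gibbs sampler for a normal density truncated to an interval, using a uniform latent variable to make both full conditionals uniform. This is only an illustration of that auxiliary variable construction, not the ARMS2 algorithm itself; the function name and parameters are chosen here for exposition.

```python
import numpy as np


def truncated_normal_gibbs(mu, sigma, a, b, n_samples, rng=None):
    """Auxiliary-variable Gibbs sampler for N(mu, sigma^2) truncated to [a, b],
    in the spirit of Damien and Walker (2001). Illustrative sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(mu, a, b)               # start inside the support
    samples = np.empty(n_samples)
    for i in range(n_samples):
        # y | x is uniform under the (unnormalised) normal kernel at x.
        y = rng.uniform(0.0, np.exp(-0.5 * ((x - mu) / sigma) ** 2))
        # x | y is uniform on the slice {x in [a, b] : (x - mu)^2 <= -2 sigma^2 log y}.
        half_width = sigma * np.sqrt(-2.0 * np.log(y))
        lo = max(a, mu - half_width)
        hi = min(b, mu + half_width)
        x = rng.uniform(lo, hi)
        samples[i] = x
    return samples


# Example usage: draws from N(0, 1) truncated to [1, 3].
draws = truncated_normal_gibbs(mu=0.0, sigma=1.0, a=1.0, b=3.0, n_samples=5000)
```

The appeal of this construction is that every conditional draw is a uniform variate, so no evaluation of normal tail probabilities or rejection step is needed; ARMS2 exploits the same idea within its truncated normal proposals.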
