Abstract

The problem of sampling from constrained continuous distributions arises frequently in machine learning and statistical models. Many Markov chain Monte Carlo (MCMC) sampling methods have been adapted to handle different types of constraints on the random variables. Among these methods, Hamiltonian Monte Carlo (HMC) and related approaches have shown significant advantages in computational efficiency compared with their counterparts. In this article, we first review HMC and some extended sampling methods, and then we concretely explain three constrained HMC-based sampling methods: reflection, reformulation, and spherical HMC. For illustration, we apply these methods to three well-known constrained sampling problems: truncated multivariate normal distributions, Bayesian regularized regression, and nonparametric density estimation. In this review, we also connect constrained sampling with a similar problem in the statistical design of experiments with a constrained design space.

This article is categorized under:
Applications of Computational Statistics > Computational Mathematics
Statistical and Graphical Methods of Data Analysis > Bayesian Methods and Theory
Statistical and Graphical Methods of Data Analysis > Markov Chain Monte Carlo (MCMC)
Statistical and Graphical Methods of Data Analysis > Sampling
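As a minimal illustration of the reflection idea mentioned in the abstract, the sketch below implements HMC with boundary reflection for a standard normal truncated to the nonnegative orthant. It is not the article's implementation; the function names, step size, and number of leapfrog steps are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not from the article): HMC with boundary reflection
# for a standard normal restricted to x >= 0 (componentwise).

def neg_log_density(x):
    # Unnormalized negative log density of N(0, I) on the nonnegative orthant.
    return 0.5 * np.sum(x ** 2)

def grad_neg_log_density(x):
    return x

def reflective_hmc(x0, n_samples=5000, step_size=0.1, n_leapfrog=20, seed=0):
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)            # resample momentum
        x_new, p_new = x.copy(), p.copy()
        p_new -= 0.5 * step_size * grad_neg_log_density(x_new)  # half step
        for step in range(n_leapfrog):
            x_new += step_size * p_new              # full position step
            # Reflection at the constraint boundary x >= 0:
            # fold the position back inside and flip the momentum component.
            crossed = x_new < 0
            x_new[crossed] = -x_new[crossed]
            p_new[crossed] = -p_new[crossed]
            if step != n_leapfrog - 1:
                p_new -= step_size * grad_neg_log_density(x_new)
        p_new -= 0.5 * step_size * grad_neg_log_density(x_new)  # half step
        # Metropolis correction based on the Hamiltonian.
        h_old = neg_log_density(x) + 0.5 * np.sum(p ** 2)
        h_new = neg_log_density(x_new) + 0.5 * np.sum(p_new ** 2)
        if rng.random() < np.exp(h_old - h_new):
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

if __name__ == "__main__":
    draws = reflective_hmc(x0=np.ones(2))
    # Each coordinate follows a half-normal, so the mean should be near 0.8.
    print(draws.mean(axis=0))
```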
