Abstract

Bayesian computing, including sampling probability distributions, learning graphical models, and Bayesian reasoning, is a powerful class of machine learning algorithms with wide applications in biological computing, financial analysis, natural language processing, autonomous driving, and robotics. The central pattern of Bayesian computing is Markov Chain Monte Carlo (MCMC) computation, which is compute-intensive and lacks explicit parallelism. In this work, we propose a parallel MCMC Bayesian computing accelerator (PMBA) architecture. Designed as a probabilistic computing platform with native support for efficient single-chain parallel Metropolis-Hastings-based MCMC sampling, PMBA boosts the performance of probabilistic programs with a massively parallel microarchitecture. PMBA is equipped with on-chip random number generators as its built-in source of randomness. The sampling units of PMBA are designed for parallel random sampling through a customized SIMD pipeline that supports data synchronization on every iteration. A companion computing framework supporting automatic parallelization and mapping of probabilistic programs is also developed. Evaluation results demonstrate that PMBA achieves a 17-21x speedup over a TITAN X GPU on MCMC sampling workloads. On probabilistic benchmarks, PMBA outperforms the best prior solutions by factors of 3.6 to 10.3. An exemplar-based visual category learning algorithm is implemented on PMBA to demonstrate its efficiency and effectiveness on complex statistical learning problems.
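As a concrete illustration of the MCMC pattern the abstract refers to, the following is a minimal serial Metropolis-Hastings sampler in plain Python/NumPy. It is a sketch of the generic algorithm only, not PMBA's parallel implementation; the target distribution (a standard normal) and all parameter values are illustrative assumptions.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings: the sequential sampling loop
    whose per-iteration steps (draw randomness, propose a candidate,
    accept or reject, store the state) an accelerator must support."""
    rng = np.random.default_rng(seed)   # source of randomness
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        candidate = x + rng.normal(0.0, step)        # generate sampling point
        log_alpha = log_target(candidate) - log_target(x)
        if np.log(rng.uniform()) < log_alpha:        # accept/reject test
            x = candidate                            # update MCMC state
        samples[i] = x                               # store sampling result
    return samples

# Illustrative target: a standard normal (log-density up to a constant).
log_normal = lambda x: -0.5 * x * x
samples = metropolis_hastings(log_normal, 0.0, 20000)
```

Note that each iteration depends on the previous state, which is why single-chain MCMC "lacks explicit parallelism" as the abstract states; exposing parallelism within this loop is the architectural problem PMBA targets.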

Highlights

  • As a foundation of modern statistics, Bayesian learning and inference theory provides efficient tools to evaluate and update beliefs in the presence of new observations

  • We extend a single-instruction multiple-thread (SIMT) instruction set [33] with operations for sampling, including choosing and loading the random number source, loading Markov Chain Monte Carlo (MCMC) states, generating sampling points, and storing sampling results, to fully support the MCMC process

  • We evaluate the performance of the Metropolis-Hastings MCMC implementation on the parallel MCMC Bayesian computing accelerator (PMBA) against multi-core CPUs and GPUs in subsection B

Introduction

As a foundation of modern statistics, Bayesian learning and inference theory provides efficient tools to evaluate and update beliefs in the presence of new observations. Owing to its advantages in learning from small samples, Bayesian reasoning has achieved significant success over the past few decades [1]. An increasing amount of evidence suggests that Bayesian reasoning accounts for many essential cognitive processes of human beings [9], [10], due to its natural advantages in handling uncertainty and providing interpretable results. We first review the basic pattern of Bayesian computation and the MCMC algorithm. Bayesian computing is based on Bayes' Theorem, which offers a general way to derive the posterior distribution of a random variable. It indicates that statistical inference can be made as follows:
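The formula itself is cut off in this excerpt; the standard statement of Bayes' Theorem that the passage introduces, for a parameter \(\theta\) and observed data \(D\), is:

\[
p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)},
\qquad
p(D) = \int p(D \mid \theta)\, p(\theta)\, d\theta
\]

The intractability of the normalizing integral \(p(D)\) for most models is what motivates MCMC methods, which draw samples from the posterior using only the unnormalized product \(p(D \mid \theta)\, p(\theta)\).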
