Abstract

Probabilistic models provide powerful tools for dealing with the uncertainty that pervades machine learning applications. Probabilistic programming represents probabilistic models as computer programs and supports both sampling and probabilistic inference conditioned on arbitrary observations. Traditionally, the dependency relationships among variables in probabilistic programs have been mainly linear or generalized linear, which has served as the basis of many successful models and inference algorithms. However, such linearity also limits the expressiveness and flexibility of probabilistic programs. Differentiable probabilistic programming allows probabilistic programs to have nonlinear dependencies under a suitable parameterization (e.g., neural networks) and to learn unknown parameters from data via gradient-based methods. This programming paradigm is easy to extend, largely avoids the tedious process of model selection, and enables end-to-end deployment of probabilistic models. This article presents ZhuSuan, an open-source library for differentiable probabilistic programming. Taking ZhuSuan as an example, we discuss the design and implementation of differentiable probabilistic programming systems.
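
The following is a minimal, illustrative sketch of the paradigm described above, not ZhuSuan's actual API: a latent variable feeds a small neural network that parameterizes the likelihood (a nonlinear dependency), and the network's weights are learned by gradient descent on a Monte Carlo bound on the log-likelihood. It is written with PyTorch's `torch.distributions` purely for concreteness; the decoder architecture and training loop are hypothetical choices.

```python
# Sketch of a differentiable probabilistic program (assumes PyTorch; not ZhuSuan's API).
import torch
import torch.nn as nn
from torch.distributions import Normal

# Nonlinear dependency: the mean of x given latent z is a neural network of z.
decoder = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
opt = torch.optim.Adam(decoder.parameters(), lr=1e-2)

x_obs = torch.randn(128, 1) * 0.5 + 2.0   # toy observed data for illustration

for step in range(200):
    z = Normal(0.0, 1.0).rsample((128, 1))    # sample the latent prior (reparameterized)
    x_dist = Normal(decoder(z), 0.5)          # likelihood parameterized by the network
    # Monte Carlo bound on the log-likelihood (ELBO with the prior as proposal).
    loss = -x_dist.log_prob(x_obs).mean()
    opt.zero_grad()
    loss.backward()                           # gradients flow through the sampled program
    opt.step()
```

Because sampling and likelihood evaluation are ordinary differentiable operations, the unknown parameters of the nonlinear dependency are fit end-to-end with the same gradient-based machinery used for neural networks.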
