Abstract

Many generative models can be expressed as a differentiable function applied to input variables sampled from a known probability distribution. This framework includes both the generative component of learned parametric models such as variational autoencoders and generative adversarial networks, and also procedurally defined simulator models which involve only differentiable operations. Though the distribution on the input variables to such models is known, often the distribution on the output variables is only implicitly defined. We present a method for performing efficient Markov chain Monte Carlo inference in such models when conditioning on observations of the model output. For some models this offers an asymptotically exact inference method where approximate Bayesian computation might otherwise be employed. We use the intuition that computing conditional expectations is equivalent to integrating over a density defined on the manifold corresponding to the set of inputs consistent with the observed outputs. This motivates the use of a constrained variant of Hamiltonian Monte Carlo which leverages the smooth geometry of the manifold to move between inputs exactly consistent with observations. We validate the method by performing inference experiments in a diverse set of models.
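To make the manifold intuition concrete, the conditional expectation can be written as an integral over the pre-image of the observations. The following is a sketch in our own notation (generator g, random inputs u with density ρ, observed outputs y; these symbols are our assumptions rather than the paper's exact ones):

    \[
      \mathbb{E}\big[ f(\mathbf{u}) \mid \mathbf{g}(\mathbf{u}) = \mathbf{y} \big]
      \propto
      \int_{\{\mathbf{u} \,:\, \mathbf{g}(\mathbf{u}) = \mathbf{y}\}}
      f(\mathbf{u}) \, \rho(\mathbf{u}) \,
      \big| \mathbf{J}(\mathbf{u}) \, \mathbf{J}(\mathbf{u})^{\mathsf{T}} \big|^{-1/2}
      \, \mathrm{d}\mathcal{H}(\mathbf{u})
    \]

where J(u) is the Jacobian of g at u and H is the Hausdorff measure on the implicitly defined manifold. The |J Jᵀ|^(-1/2) factor follows from the co-area formula, and the resulting density on the manifold is the target the constrained sampler explores.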

Highlights

  • As well as parametric models learnt from data, the differentiable generative model (DGM) framework encapsulates a wide class of simulator models where the generator is defined procedurally, e.g. by numerical integration of a system of stochastic differential equations

  • We have presented a generally applicable framework for performing inference in differentiable generative models

  • Though simulating the constrained Hamiltonian dynamics is computationally costly, the resulting coherent exploration of the state space can lead to significantly improved sampling efficiency over alternative methods (a minimal sketch of such an integrator follows this list)
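To illustrate where the cost referred to in the last highlight comes from, below is a minimal sketch (our own, not the paper's implementation) of one step of a RATTLE-style constrained leapfrog integrator in JAX. It assumes an identity mass matrix, a potential U(q) = 0.5 qᵀq corresponding to a standard normal prior on the inputs, and a fixed number of Newton iterations, all of which are our assumptions.

    # A minimal sketch (our own, not the paper's implementation) of one
    # RATTLE-style constrained leapfrog step: positions stay on the manifold
    # c(q) = 0 and momenta stay in its tangent space. Assumes an identity
    # mass matrix and potential U(q) = 0.5 * q @ q (standard normal inputs).
    import jax
    import jax.numpy as jnp

    def rattle_step(q, p, c, eps, newton_iters=10):
        """One constrained leapfrog step for a constraint c: R^n -> R^m."""
        jac_c = jax.jacfwd(c)
        grad_U = lambda q: q  # gradient of U(q) = 0.5 * q @ q

        # Half-step the momentum, then solve for Lagrange multipliers so
        # that the full position step lands back on the manifold.
        p_half = p - 0.5 * eps * grad_U(q)
        J = jac_c(q)
        lam = jnp.zeros_like(c(q))
        for _ in range(newton_iters):  # a full version would test convergence
            q_new = q + eps * (p_half - J.T @ lam)
            lam = lam + jnp.linalg.solve(jac_c(q_new) @ J.T, c(q_new) / eps)
        p_half = p_half - J.T @ lam
        q = q + eps * p_half

        # Second momentum half-step, then project the momentum onto the
        # tangent space so that J(q) @ p = 0.
        p = p_half - 0.5 * eps * grad_U(q)
        J = jac_c(q)
        p = p - J.T @ jnp.linalg.solve(J @ J.T, J @ p)
        return q, p

    # For a differentiable generative model the constraint would be
    # c(u) = g(u) - y_obs, i.e. inputs exactly consistent with observations.

Each step requires repeated constraint Jacobian evaluations and linear solves, which is the cost the highlight refers to; a full implementation would also reject steps for which the Newton iteration fails to converge, so as to preserve reversibility.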

Introduction

As well as parametric models learnt from data, the DGM framework encapsulates a wide class of simulator models where the generator is defined procedurally, e.g. by numerical integration of a system of stochastic differential equations. The operations used in many simulations are differentiable, and automatic differentiation provides a computationally efficient framework for calculating the exact derivatives of a differentiable simulator’s outputs with respect to its random inputs, given just the code used to define the model. Often we will be interested in using a generative model to make inferences about the modelled variables given observations related to outputs of the model. For example, given a simulator of a physical process and a generator encoding which process parameters we believe are reasonable a priori, we may wish to infer our posterior beliefs about the parameters under the model given observations of the physical process. The difficulty is that the distribution on the model outputs is typically defined only implicitly: in directed models this usually corresponds to not being able to evaluate the likelihood p(y | u) of the observed outputs y given the random inputs u. The lack of a closed-form density impedes the direct use of approximate inference methods such as variational inference and Markov chain Monte Carlo (MCMC).
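As an illustration, the sketch below integrates a stochastic Lotka–Volterra model (one of the experiments listed below) with a simple Euler–Maruyama scheme and uses JAX to compute the exact Jacobian of the simulated outputs with respect to the random inputs. The discretization, parameter names, and constants are our own assumptions, not the paper's exact setup.

    # A minimal sketch (not the paper's code): a differentiable stochastic
    # Lotka-Volterra simulator driven by standard normal inputs, with the
    # exact output-input Jacobian computed by automatic differentiation.
    import jax
    import jax.numpy as jnp

    def simulate(u, theta, x0=jnp.array([10.0, 5.0]), dt=0.1, sigma=0.1):
        """Euler-Maruyama integration of a stochastic Lotka-Volterra model.

        u     : (T, 2) standard normal random inputs (the noise increments).
        theta : (4,) rate parameters (a, b, c, d); names are our assumption.
        """
        a, b, c, d = theta

        def step(x, u_t):
            prey, pred = x
            drift = jnp.array([a * prey - b * prey * pred,
                               c * prey * pred - d * pred])
            x_new = x + dt * drift + jnp.sqrt(dt) * sigma * u_t
            return x_new, x_new

        _, path = jax.lax.scan(step, x0, u)
        return path  # (T, 2) simulated population trajectory

    theta = jnp.array([0.5, 0.025, 0.015, 0.5])
    u = jax.random.normal(jax.random.PRNGKey(0), (50, 2))

    # Jacobian of the final state with respect to every random input: the
    # quantity a constrained sampler needs to move along the manifold of
    # inputs consistent with an observed output.
    jac = jax.jacrev(lambda u: simulate(u, theta)[-1])(u)
    print(jac.shape)  # (2, 50, 2)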

Approximate Bayesian Computation
Constrained Hamiltonian Monte Carlo
Method
Related work
Experiments
Lotka–Volterra parameter inference
Human pose and camera model inference
MNIST in-painting
Discussion