Abstract
Bayesian inverse problems often involve sampling posterior distributions on infinite-dimensional function spaces. Traditional Markov chain Monte Carlo (MCMC) algorithms are characterized by deteriorating mixing times upon mesh-refinement, when the finite-dimensional approximations become more accurate. Such methods are typically forced to reduce step-sizes as the discretization gets finer, and thus are expensive as a function of dimension. Recently, a new class of MCMC methods with mesh-independent convergence times has emerged. However, few of them take into account the geometry of the posterior informed by the data. At the same time, recently developed geometric MCMC algorithms have been found to be powerful in exploring complicated distributions that deviate significantly from elliptic Gaussian laws, but are in general computationally intractable for models defined in infinite dimensions. In this work, we combine geometric methods on a finite-dimensional subspace with mesh-independent infinite-dimensional approaches. Our objective is to speed up MCMC mixing times, without significantly increasing the computational cost per step (for instance, in comparison with the vanilla preconditioned Crank–Nicolson (pCN) method). This is achieved by using ideas from geometric MCMC to probe the complex structure of an intrinsic finite-dimensional subspace where most data information concentrates, while retaining robust mixing times as the dimension grows by using pCN-like methods in the complementary subspace. The resulting algorithms are demonstrated in the context of three challenging inverse problems arising in subsurface flow, heat conduction and incompressible flow control. The algorithms exhibit up to two orders of magnitude improvement in sampling efficiency when compared with the pCN method.
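The baseline method referenced throughout is the preconditioned Crank–Nicolson (pCN) algorithm. As a minimal sketch (not the paper's implementation), the following shows one pCN Metropolis step in whitened coordinates, where the prior is a standard Gaussian on the discretized parameter; the `log_likelihood` function is a hypothetical user-supplied callable. The key property is that the proposal is prior-reversible, so the acceptance ratio involves only the likelihood and the step remains well-defined as the discretization dimension grows.

```python
import numpy as np

def pcn_step(u, log_likelihood, beta, rng):
    """One preconditioned Crank-Nicolson (pCN) Metropolis step.

    Sketch under the assumption of a standard Gaussian prior N(0, I)
    on the discretized parameter vector `u` (whitened coordinates);
    `log_likelihood` is a hypothetical user-supplied function.
    """
    # pCN proposal: sqrt(1 - beta^2) * u + beta * xi with xi ~ N(0, I).
    # This proposal is reversible with respect to the Gaussian prior.
    xi = rng.standard_normal(u.shape)
    v = np.sqrt(1.0 - beta**2) * u + beta * xi

    # Accept/reject using the likelihood ratio only; the prior terms cancel.
    log_alpha = log_likelihood(v) - log_likelihood(u)
    if np.log(rng.uniform()) < log_alpha:
        return v, True
    return u, False
```

A chain is run by iterating `pcn_step` from any starting point; the step size `beta` in (0, 1] trades off move size against acceptance rate.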
Highlights
In this work we consider Bayesian inverse problems where the objective is to identify an unknown function parameter u which is an element of a separable Hilbert space (X, ⟨·, ·⟩, ‖·‖)
Unlike recent geometric methods such as Stochastic Newton (SN) MCMC [9] and Riemannian manifold Hamiltonian Monte Carlo for large-scale PDE-constrained inverse problems [10], the proposed advanced MCMC algorithms are well-defined on the Hilbert space
We review some of the advanced MCMC methods published in the literature; see e.g. [1,2,3] or [7] for recent contributions
Summary
In this work we consider Bayesian inverse problems where the objective is to identify an unknown function parameter u which is an element of a separable Hilbert space (X, ⟨·, ·⟩, ‖·‖). Sampling from the posterior μ^y in the context of PDE-constrained inverse problems is typically a very challenging undertaking, due to the high dimensionality of the target, the non-Gaussianity of the posterior, and the computational burden of repeated PDE solutions for evaluating the likelihood function at different parameters. It is well understood that traditional Metropolis–Hastings algorithms have deteriorating mixing times upon refinement of the mesh size used in the finite-dimensional projection of the parameter u. Unlike recent geometric methods such as Stochastic Newton (SN) MCMC [9] and Riemannian manifold Hamiltonian Monte Carlo for large-scale PDE-constrained inverse problems [10], the proposed advanced MCMC algorithms are well-defined on the Hilbert space: they have the capacity both to explore complex probability structures and to retain robust mixing times in high dimensions.
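The central idea of combining geometric moves on a data-informed subspace with pCN on its complement can be illustrated with a simplified proposal sketch. This is not the paper's exact scheme: here the "geometric" component is reduced to simply taking a larger step size on an orthonormal subspace basis `basis_r` (an assumed input, e.g. from a low-rank approximation of the data misfit Hessian), whereas the actual algorithms use likelihood-informed, position-dependent proposals there. The complement keeps the pCN form, which is what preserves mesh-robustness.

```python
import numpy as np

def split_pcn_proposal(u, basis_r, beta_r, beta_c, rng):
    """Illustrative dimension-split proposal (a sketch, not the paper's method).

    Splits the whitened parameter `u` into a low-dimensional data-informed
    subspace spanned by the orthonormal columns of `basis_r` and its
    complement. A larger step `beta_r` is taken on the informed subspace;
    a standard pCN step with `beta_c` is used on the complement.
    """
    # Orthogonal projections onto the informed subspace and its complement.
    u_r = basis_r @ (basis_r.T @ u)
    u_c = u - u_r
    xi = rng.standard_normal(u.shape)
    xi_r = basis_r @ (basis_r.T @ xi)
    xi_c = xi - xi_r
    # pCN-form update in each subspace, with separate step sizes.
    v_r = np.sqrt(1.0 - beta_r**2) * u_r + beta_r * xi_r
    v_c = np.sqrt(1.0 - beta_c**2) * u_c + beta_c * xi_c
    return v_r + v_c
```

When `beta_r == beta_c` this reduces exactly to a plain pCN proposal; choosing them separately is what allows aggressive exploration of the informed directions without destroying dimension-robust mixing in the complement.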