Abstract

In this paper we study properties of the Laplace approximation of the posterior distribution arising in nonlinear Bayesian inverse problems. Our work is motivated by Schillings et al. (Numer Math 145:915–971, 2020. https://doi.org/10.1007/s00211-020-01131-1), where it is shown that in such a setting the Laplace approximation error in Hellinger distance converges to zero in the order of the noise level. Here, we prove novel error estimates for a given noise level that also quantify the effect due to the nonlinearity of the forward mapping and the dimension of the problem. In particular, we are interested in settings in which a linear forward mapping is perturbed by a small nonlinear mapping. Our results indicate that in this case, the Laplace approximation error is of the size of the perturbation. The paper provides insight into Bayesian inference in nonlinear inverse problems, where linearization of the forward mapping has suitable approximation properties.
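To fix ideas, the following is a minimal sketch of the object the abstract refers to, in generic notation; the symbols below (forward map G, potential Φ, MAP point, Hessian H) are standard conventions chosen for illustration and are not necessarily the paper's exact notation or setting.

```latex
% A generic Bayesian inverse problem at noise level \varepsilon:
\[
  y = G(x) + \varepsilon\,\eta, \qquad \eta \sim \mathcal{N}(0, \Gamma),
\]
% with prior density \pi_0, so that the posterior takes the form
\[
  \pi^{y}(x) \;\propto\; \exp\bigl(-\Phi(x; y)\bigr)\,\pi_0(x), \qquad
  \Phi(x; y) = \frac{1}{2\varepsilon^{2}} \bigl\| \Gamma^{-1/2}\bigl(y - G(x)\bigr) \bigr\|^{2}.
\]
% The Laplace approximation is the Gaussian matching the posterior's mode
% and the curvature of the negative log-posterior at that mode:
\[
  \mathcal{L}\pi^{y} = \mathcal{N}\bigl(x_{\mathrm{MAP}},\, H^{-1}\bigr), \qquad
  x_{\mathrm{MAP}} = \arg\min_{x} I(x), \quad
  I(x) = \Phi(x; y) - \log \pi_0(x), \quad
  H = \nabla^{2} I(x_{\mathrm{MAP}}).
\]
```

For a linear G and Gaussian prior the posterior is itself Gaussian and the approximation is exact; the paper quantifies how far this breaks down when G is nonlinear.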

Highlights

  • The study of Bayesian inverse problems [8,26] has gained wide attention during the last decade as the increase in computational resources and algorithmic development have enabled uncertainty quantification in numerous new applications in science and engineering.

  • In this paper we study properties of the Laplace approximation of the posterior distribution arising in nonlinear Bayesian inverse problems.


Summary

Introduction

The study of Bayesian inverse problems [8,26] has gained wide attention during the last decade as the increase in computational resources and algorithmic development have enabled uncertainty quantification in numerous new applications in science and engineering. Our work is motivated by a recent result by Schillings, Sprungk, and Wacker in [25], where the authors show that in the context of Bayesian inverse problems, the Laplace approximation error in Hellinger distance converges to zero in the order of the noise level. However, the nonlinearity of the forward mapping (more generally, the non-Gaussianity of the likelihood) or a large problem dimension can make a significant contribution to the constant appearing in such asymptotic estimates. It is of interest to quantify these effects in non-asymptotic error estimates for the Laplace approximation; this is the main goal of our work. Our results describe and quantify how the nonlinearity of the forward mapping translates into non-Gaussianity of the posterior distribution.
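The perturbed-linear setting discussed above can be illustrated numerically. The following is a self-contained toy sketch, not the paper's construction: the forward map G(x) = a*x + delta*sin(x), the constants, and the standard Gaussian prior are all illustrative assumptions.

```python
# Toy 1-D inverse problem: a linear forward map perturbed by a small
# nonlinearity, with Gaussian noise and a standard Gaussian prior.
# All model choices here are illustrative, not taken from the paper.
import numpy as np
from scipy.optimize import minimize_scalar

a, delta = 1.0, 0.05   # linear coefficient; size of the nonlinear perturbation
eps = 0.1              # noise level
rng = np.random.default_rng(0)

def G(x):
    return a * x + delta * np.sin(x)

y = G(0.7) + eps * rng.standard_normal()   # synthetic data from x_true = 0.7

def neg_log_post(x):
    # data-misfit potential plus negative log-density of a N(0, 1) prior
    return 0.5 * (y - G(x)) ** 2 / eps ** 2 + 0.5 * x ** 2

# MAP point, and the Hessian of the negative log-posterior at the MAP
# (estimated by central finite differences)
x_map = minimize_scalar(neg_log_post).x
h = 1e-5
H = (neg_log_post(x_map + h) - 2 * neg_log_post(x_map)
     + neg_log_post(x_map - h)) / h ** 2

# Evaluate the true posterior (normalized numerically) and its Laplace
# approximation N(x_map, 1/H) on a grid around the MAP point.
xs = np.linspace(x_map - 8 / np.sqrt(H), x_map + 8 / np.sqrt(H), 4001)
dx = xs[1] - xs[0]
post = np.exp(-(neg_log_post(xs) - neg_log_post(x_map)))
post /= post.sum() * dx
lap = np.sqrt(H / (2 * np.pi)) * np.exp(-0.5 * H * (xs - x_map) ** 2)

# Hellinger distance: d_H(p, q)^2 = (1/2) * integral of (sqrt(p) - sqrt(q))^2
d_hell = np.sqrt(0.5 * np.sum((np.sqrt(post) - np.sqrt(lap)) ** 2) * dx)
print(f"delta = {delta:.3f}, Hellinger distance ~ {d_hell:.2e}")
```

Re-running the sketch with a smaller delta shrinks the reported distance, which matches the qualitative message above: for small nonlinear perturbations of a linear problem, the Laplace approximation error is of the size of the perturbation.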

Our contributions
Relevant literature
Organization of the paper
Preliminaries and set-up
Central error estimate
Optimal choice of the parameter
Explicit error estimate
Asymptotic behavior for fixed and increasing problem dimension
Perturbed linear problems with Gaussian prior
Outlook

