Abstract

We propose the application of iterative regularization to the development of ensemble methods for solving Bayesian inverse problems. Concretely, we construct (i) a variational iterative regularizing ensemble Levenberg-Marquardt method (IR-enLM) and (ii) a derivative-free iterative ensemble Kalman smoother (IR-ES). The aim of these methods is to provide a robust ensemble approximation of the Bayesian posterior. The proposed methods are based on fundamental ideas from iterative regularization methods that have been widely used for the solution of deterministic inverse problems (Kaltenbacher et al., de Gruyter, Berlin, 2008). In this work, we are interested in applying the proposed ensemble methods to Bayesian inverse problems that arise in reservoir modeling applications. The proposed ensemble methods use key aspects of the regularizing Levenberg-Marquardt scheme developed by Hanke (Inverse Problems 13, 79–95, 1997), which we recently applied to history matching in Iglesias (Comput. Geosci., 1–21, 2013). Unlike most existing methods, where the stopping criteria and regularization parameters are typically selected heuristically, in the proposed ensemble methods the discrepancy principle is applied for (i) the selection of the regularization parameters and (ii) the early termination of the scheme. The discrepancy principle is key to the theory of iterative regularization, and the purpose of the present work is to apply this principle to the development of ensemble methods defined as iterative updates of solutions to linear ill-posed inverse problems. The regularizing and convergence properties of iterative regularization methods for deterministic inverse problems have long been established. However, the approximation properties of the proposed ensemble methods in the context of Bayesian inverse problems remain an open problem. In the case where the forward operator is linear and the prior is Gaussian, we show that the tunable parameters of the proposed IR-enLM and IR-ES can be chosen so that the resulting schemes coincide with the standard randomized maximum likelihood (RML) method and the ensemble smoother (ES), respectively. Therefore, the proposed methods sample from the posterior in the linear-Gaussian case. As with the RML and ES methods, in the nonlinear case one cannot conclude that the proposed methods produce samples from the posterior. The present work provides a numerical investigation of the performance of the proposed ensemble methods at capturing the posterior. In particular, we aim to understand the role of the tunable parameters that arise from the application of iterative regularization techniques. The numerical framework for our investigations consists of using a state-of-the-art Markov chain Monte Carlo (MCMC) method to resolve the Bayesian posterior from synthetic experiments. The posterior resolved via MCMC then provides a gold standard against which to compare the proposed IR-enLM and IR-ES. Our numerical experiments clearly indicate that the regularizing properties of the regularization methods applied for the computation of each ensemble member have a significant impact on how well the proposed ensemble methods capture the Bayesian posterior. Furthermore, we compare the proposed regularizing methods with unregularized methods that have typically been used in the literature. Our numerical experiments showcase the advantage of using iterative regularization for obtaining more robust and stable approximations of the posterior than unregularized methods.
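
To make the role of the discrepancy principle concrete, the following minimal Python sketch illustrates a single regularizing Levenberg-Marquardt step of the kind referred to above, in the spirit of Hanke's scheme. The function names, the doubling search for the regularization parameter alpha, and the default value of rho are illustrative assumptions, not the implementation used in this work.

import numpy as np

def misfit_norm(v, Gamma):
    # Noise-weighted misfit ||Gamma^{-1/2} v||, the norm used in the discrepancy principle.
    return np.sqrt(v @ np.linalg.solve(Gamma, v))

def regularizing_lm_step(u, G_u, J, y, C, Gamma, rho=0.7, alpha0=1.0):
    # One regularizing Levenberg-Marquardt update in the spirit of Hanke (1997).
    # u: current estimate, G_u = G(u), J: Jacobian of G at u, y: observed data,
    # C: prior covariance (regularization weight), Gamma: observation-error covariance.
    # alpha is increased until the linearized residual retains at least a fraction rho
    # of the current residual (a Hanke-type condition); the doubling search for alpha
    # is an assumption made for illustration.
    r = y - G_u
    r_norm = misfit_norm(r, Gamma)
    alpha = alpha0
    while True:
        S = J @ C @ J.T + alpha * Gamma        # linearized prior-predicted data covariance plus alpha-scaled noise covariance
        du = C @ J.T @ np.linalg.solve(S, r)   # Levenberg-Marquardt increment
        if misfit_norm(r - J @ du, Gamma) >= rho * r_norm:
            return u + du, alpha               # accept the step and report the selected alpha
        alpha *= 2.0                           # more regularization, smaller step

In IR-enLM, each ensemble member would be iterated with steps of this kind and terminated once its noise-weighted misfit falls below tau times the noise level, with tau > 1/rho; this is the early-termination role of the discrepancy principle described above.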

Highlights

  • We use key elements of iterative regularization techniques to develop robust ensemble methods for solving Bayesian inverse problems that arise in subsurface flow

  • In the case where the forward operator is linear and the prior is Gaussian, we show that the tunable parameters of the proposed iterative regularizing ensemble Levenberg-Marquardt method (IR-enLM) and iterative ensemble Kalman smoother (IR-ES) can be chosen so that the resulting schemes coincide with the standard randomized maximum likelihood (RML) and the ensemble smoother (ES), respectively (a heuristic sketch of this coincidence is given after these highlights)

  • Our MCMC results provide a gold standard that we use to investigate the performance of IR-enLM and IR-ES at capturing aspects of the Bayesian posterior
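
To give a heuristic sense of why this linear-Gaussian coincidence holds (an illustrative sketch, not the argument given in the paper): for a linear forward operator G, Gaussian prior with covariance C, and noise covariance \Gamma (with \lVert v\rVert_{\Gamma} := \lVert \Gamma^{-1/2} v\rVert), the RML sample associated with a prior draw u_j and perturbed data y_j is

\[
u_j^{\mathrm{RML}}
  = \arg\min_{u}\ \tfrac12\lVert y_j - G u\rVert_{\Gamma}^{2} + \tfrac12\lVert u - u_j\rVert_{C}^{2}
  = u_j + C G^{\top}\bigl(G C G^{\top} + \Gamma\bigr)^{-1}\bigl(y_j - G u_j\bigr),
\]

whereas a regularizing Levenberg-Marquardt update of the same member with regularization parameter \alpha reads

\[
u_j^{+} = u_j + C G^{\top}\bigl(G C G^{\top} + \alpha \Gamma\bigr)^{-1}\bigl(y_j - G u_j\bigr).
\]

The two expressions coincide, for instance, when the tunable parameters are chosen so that the scheme takes a single update with \alpha = 1; an analogous observation applies to the ensemble-covariance update of IR-ES and the standard ES.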

Introduction

We use key elements of iterative regularization techniques to develop robust ensemble methods for solving Bayesian inverse problems that arise in subsurface flow. A novel aspect of these ensemble methods is that the discrepancy principle is used for (i) the selection of the regularization parameter that stabilizes the update of each ensemble member and (ii) the stopping criterion that avoids data overfitting. These strategies have been used to theoretically establish the convergence and regularizing properties of some well-known iterative regularization methods aimed at solving deterministic nonlinear ill-posed inverse problems. Due to the large size of the space X, the computation of a minimizer of (1) is unstable (ill-posed), in the sense that an arbitrarily small data misfit (1) need not correspond to an estimate u that is close to the optimal one. This ill-posedness, which arises from the mathematical structure of G, requires regularization. The prior error covariance C is used for the regularization that is built into the iterative scheme.
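
As a companion to the sketch given after the abstract, the following derivative-free Python sketch, loosely in the spirit of the IR-ES, replaces the Jacobian with ensemble cross-covariances and lets the discrepancy principle both select the regularization parameter and terminate the iteration. The signature, the alpha doubling rule, and the default tau are assumptions made for illustration only, not the authors' implementation.

import numpy as np

def misfit(r, Gamma):
    # Noise-weighted misfit ||Gamma^{-1/2} r||.
    return np.sqrt(r @ np.linalg.solve(Gamma, r))

def ir_es_sketch(U, forward, Y, Gamma, eta, rho=0.7, tau=2.0, max_iter=50):
    # U: (Ne, Np) ensemble of parameter vectors; Y: (Ne, Nd) data, one copy per member;
    # forward: callable mapping a parameter vector to predicted data G(u);
    # Gamma: observation-error covariance; eta: noise level; tau > 1/rho for early termination.
    Ne = U.shape[0]
    for _ in range(max_iter):
        W = np.array([forward(u) for u in U])          # ensemble predictions G(u_j)
        R = Y - W                                      # residuals per member
        misfits = np.array([misfit(r, Gamma) for r in R])
        if misfits.mean() <= tau * eta:                # discrepancy principle: stop before overfitting
            break
        dU, dW = U - U.mean(axis=0), W - W.mean(axis=0)
        C_uw = dU.T @ dW / (Ne - 1)                    # parameter-data cross-covariance (replaces the Jacobian)
        C_ww = dW.T @ dW / (Ne - 1)                    # covariance of the predicted data
        alpha = 1.0
        while True:                                    # Hanke-type selection of the regularization parameter
            S = C_ww + alpha * Gamma
            lin_res = np.array([r - C_ww @ np.linalg.solve(S, r) for r in R])
            if np.mean([misfit(lr, Gamma) for lr in lin_res]) >= rho * misfits.mean():
                break
            alpha *= 2.0
        U = U + np.array([C_uw @ np.linalg.solve(S, r) for r in R])   # Kalman-type update of each member
    return U

Here the ensemble covariances approximate the roles played by the prior covariance C and the Jacobian of G in the variational scheme, which is what makes the iteration derivative-free.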
