Abstract

We present a methodology for obtaining a correct sampling of the posterior probability density function (pdf) conditional to observations, where this posterior pdf can be formally expressed using Bayes' theorem. Generating a correct sampling of a multimodal posterior pdf is a challenging task that can only be achieved with Markov chain Monte Carlo (MCMC) methods. In standard MCMC, such as random-walk MCMC, evaluating the acceptance probability for a proposed state requires a forward model run (a reservoir simulation run). When the forward model run is computationally expensive, we cannot afford to generate long Markov chains with tens of thousands of states or more. It is therefore critically important to design the MCMC so that it converges to the posterior pdf after generating a few thousand states or fewer. Here, we develop and apply a two-level MCMC procedure that can sample multimodal posteriors relatively efficiently. In the first step, we use the distributed Gauss-Newton (DGN) method to generate many modes of the posterior pdf in parallel; this procedure estimates sensitivity matrices without requiring an adjoint solution. A Gaussian mixture model (GMM) is then constructed based on the distinct modes found in the first step. In the second step, the constructed GMM is used as the proposal distribution for our MCMC algorithm. Because the proposal distribution is constructed as a direct approximation of the target pdf (without the normalizing constant), the Markov chain(s) constructed should converge relatively quickly to the posterior distribution. Applications of the two-level MCMC algorithm to test problems show that it is far more efficient than random-walk MCMC.
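The second-step sampler described above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it shows an independence-style Metropolis-Hastings chain on a simple one-dimensional bimodal target, where the GMM proposal (its means, widths, and weights) stands in for the mixture that the DGN step would supply. All names and the toy target are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unnormalized bimodal log-target, standing in for the posterior pdf
# p(m | d_obs) that Bayes' theorem gives up to a normalizing constant.
def log_target(x):
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

# GMM proposal: in the two-level scheme its components would be centered
# at the distinct modes found by DGN. Here the parameters are assumed.
means = np.array([-2.0, 2.0])
stds = np.array([1.0, 1.0])
weights = np.array([0.5, 0.5])

def sample_gmm(rng):
    # Pick a mixture component, then draw from that Gaussian.
    k = rng.choice(len(weights), p=weights)
    return rng.normal(means[k], stds[k])

def log_gmm(x):
    # Log-density of the GMM, evaluated stably in log space.
    comp = -0.5 * ((x - means) / stds) ** 2 - np.log(stds * np.sqrt(2.0 * np.pi))
    return np.logaddexp.reduce(np.log(weights) + comp)

def independence_mh(n_steps, rng):
    x = sample_gmm(rng)
    chain = []
    for _ in range(n_steps):
        y = sample_gmm(rng)  # proposal is independent of the current state
        # Independence-sampler acceptance ratio: pi(y) q(x) / (pi(x) q(y))
        log_alpha = (log_target(y) - log_target(x)) + (log_gmm(x) - log_gmm(y))
        if np.log(rng.random()) < log_alpha:
            x = y
        chain.append(x)
    return np.array(chain)

samples = independence_mh(20000, rng)
```

Because the proposal already approximates the target, proposed jumps between the two modes are accepted frequently, which is the mechanism behind the fast convergence claimed for the two-level scheme; a random-walk chain with a small step size would instead struggle to cross between modes.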
