Abstract

We analyze reduced basis (RB) acceleration of recently proposed sparse Bayesian inversion algorithms for partial differential equations with uncertain distributed parameters, for observation data subject to additive Gaussian observation noise. Specifically, we consider Bayesian inversion of affine-parametric, linear operator families on possibly high-dimensional parameter spaces. We consider “high-fidelity” Petrov–Galerkin (PG) discretizations of these countably-parametric operator families: we allow general families of inf–sup stable PG Finite-Element methods, covering most conforming primal and mixed Finite-Element discretizations of standard problems in mechanics. We propose RB acceleration of the high-dimensional, parametric forward response maps, which must be solved numerically many times in Bayesian inversion, and derive convergence rate bounds for the error in the Bayesian estimate incurred by the use of RB. As a consequence of recent theoretical results on dimension-independent sparsity of parametric responses, and on preservation of sparsity for holomorphic-parametric problems, we establish new convergence rates of greedy RB compressions both for the parametric forward maps and for the countably-parametric posterior densities which arise in Bayesian inversion. We show that the convergence rates for the RB compressions of the parametric forward maps, as well as of the countably-parametric, sparse Bayesian posterior densities, are free from the curse of dimensionality and depend only on the sparsity of the uncertain input data. In particular, we establish quadratic convergence of the RB compression for the posterior densities with respect to that for the parametric forward maps.
Numerical experiments for linear elliptic, affine-parametric model problems in two space dimensions with hundreds of parameters are reported, confirming that the proposed adaptive sparse grid reduced basis algorithms indeed exploit the sparsity of both the parametric forward maps and the Bayesian posterior density.
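The quadratic convergence claim can be sketched as follows, in illustrative notation that is assumed here rather than taken from the paper (the symbols $u_N$, $\varepsilon_N$, $\Phi$, and $\Theta_N$ are hypothetical placeholders): if the RB surrogate forward map approximates the parametric forward map uniformly over the parameter domain, then the induced RB posterior density converges at twice the rate.

```latex
% Hedged sketch (assumed notation): u(y) is the parametric forward map,
% u_N(y) its N-dimensional RB surrogate, and
% \Theta(y) = \exp(-\Phi(y)) an (unnormalized) Bayesian posterior density
% with Gaussian misfit \Phi. A uniform forward-map error bound
\sup_{y}\, \lVert u(y) - u_N(y) \rVert_{V} \;\le\; \varepsilon_N
% is then asserted to imply a posterior-density error bound of the form
\quad\Longrightarrow\quad
\sup_{y}\, \lvert \Theta(y) - \Theta_N(y) \rvert \;\le\; C\,\varepsilon_N^{2},
% i.e. the RB compression error of the posterior density is quadratic
% in the RB compression error of the parametric forward map.
```

This is only a schematic restatement of the abstract's claim; the precise constants, norms, and sparsity assumptions are those established in the paper itself.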
