Abstract

Bayesian inference is a key method for estimating parametric uncertainty from data. However, most Bayesian inference methods require either an explicit likelihood function or a large number of samples, neither of which is realistic for complex first-principles-based models. Here, we propose a novel Bayesian inference methodology for estimating uncertain parameters of computationally intensive first-principles-based models. Our approach combines low-complexity surrogate models with variational inference using arbitrarily expressive inference models. The proposed methodology predicts output responses indirectly and casts Bayesian inference as an optimization problem. We demonstrate its performance on synthetic problems, a computational fluid dynamics case, and a kinetic Monte Carlo simulation to verify its applicability. This fast and reliable methodology captures multimodality and the shape of complicated posterior distributions with the quality of state-of-the-art Hamiltonian Monte Carlo methods, but at a much lower computational cost.
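The core idea of the abstract — replace an expensive simulator with a cheap surrogate and recast Bayesian inference as an optimization problem via variational inference — can be sketched on a toy problem. The sketch below is a hypothetical illustration, not the paper's method: it assumes a linear surrogate `f(theta) = a * theta`, a Gaussian prior and observation noise, and a simple Gaussian variational family whose ELBO gradients happen to be available in closed form for this conjugate case.

```python
# Hypothetical toy: variational inference through a cheap surrogate model.
# All modeling choices (linear surrogate, Gaussian prior/noise, step size)
# are assumptions for illustration, not the paper's actual setup.
import numpy as np

a = 2.0          # slope of the assumed linear surrogate f(theta) = a * theta
sigma_n = 0.5    # observation-noise standard deviation
sigma_0 = 1.0    # prior standard deviation, theta ~ N(0, sigma_0^2)
theta_true = 0.7

rng = np.random.default_rng(0)
N = 50
y = a * theta_true + sigma_n * rng.standard_normal(N)  # synthetic data

# Variational family q(theta) = N(mu, s^2); maximize the ELBO by gradient
# ascent. For this linear-Gaussian toy the gradients are closed-form.
mu, s = 0.0, 0.1
lr = 1e-3
sum_y = y.sum()
precision = 1.0 / sigma_0**2 + N * a**2 / sigma_n**2  # exact posterior precision

for _ in range(5000):
    grad_mu = a * sum_y / sigma_n**2 - mu * precision  # dELBO/dmu
    grad_s = -s * precision + 1.0 / s                  # dELBO/ds
    mu += lr * grad_mu
    s = max(s + lr * grad_s, 1e-4)                     # keep s positive

# Exact conjugate posterior, for comparison with the variational fit
post_mean = (a * sum_y / sigma_n**2) / precision
post_sd = precision**-0.5
print(mu, s, post_mean, post_sd)
```

Because the toy posterior is Gaussian, the Gaussian variational family is exact and the optimized `(mu, s)` recovers the conjugate posterior mean and standard deviation; the paper's contribution lies in handling the non-Gaussian, multimodal posteriors where such closed forms do not exist.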
