Abstract

In 2000, Kennedy and O’Hagan proposed a model for uncertainty quantification that combines data of several levels of sophistication, fidelity, quality, or accuracy, e.g., a coarse and a fine mesh in finite-element simulations. They assumed each level to be describable by a Gaussian process, and used low-fidelity simulations to improve inference on costly high-fidelity simulations. Departing from there, we move away from the common non-Bayesian practice of optimization and marginalize the parameters instead. Thus, we avoid the awkward logical dilemma of having to choose parameters and of neglecting that choice’s uncertainty. We propagate the parameter uncertainties by averaging the predictions and the prediction uncertainties over all the possible parameters. This is done analytically for all but the nonlinear or inseparable kernel function parameters. What is left is a low-dimensional and feasible numerical integral depending on the choice of kernels, thus allowing for a fully Bayesian treatment. By quantifying the uncertainties of the parameters themselves too, we show that “learning” or optimizing those parameters has little meaning when data are scarce and thus justify all our mathematical efforts. The recent hype about machine learning has long spilled over to computational engineering, but it fails to acknowledge that machine learning is a big data problem and that, in computational engineering, we usually face a little data problem. We devise the fully Bayesian uncertainty quantification method in a notation following the tradition of E. T. Jaynes and find that generalization to an arbitrary number of levels of fidelity, as well as parallelization, becomes rather easy. We scrutinize the method with mock data and demonstrate its advantages in its natural application, where high-fidelity data are scarce but low-fidelity data are plentiful. We then apply the method to quantify the uncertainties in finite-element simulations of impedance cardiography of aortic dissection. Aortic dissection is a cardiovascular disease that frequently requires immediate surgical treatment and, thus, a fast diagnosis beforehand. While traditional medical imaging techniques such as computed tomography, magnetic resonance tomography, or echocardiography certainly do the job, impedance cardiography is also a standard clinical tool and promises to allow earlier diagnoses as well as to detect patients who would otherwise go under the radar for too long.
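
The two-level structure can be made concrete with a short sketch. The following Python code is a minimal, illustrative two-fidelity Gaussian process in the spirit of Kennedy and O’Hagan (high fidelity modelled as a scaling factor times the low-fidelity process plus a discrepancy process); the kernels, the fixed hand-picked hyperparameters, and the toy data are assumptions for illustration, not the paper’s exact formulation.

```python
# Hedged sketch: a two-level Kennedy-O'Hagan-style co-kriging model with fixed,
# hand-picked hyperparameters and toy data; not the authors' exact formulation.
import numpy as np

def rbf(xa, xb, ell, var):
    """Squared-exponential kernel k(x, x') = var * exp(-(x - x')^2 / (2 ell^2))."""
    d = xa[:, None] - xb[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

# Toy 1-D data: many cheap low-fidelity runs, few expensive high-fidelity runs.
x_lo = np.linspace(0.0, 1.0, 15)
x_hi = np.array([0.1, 0.45, 0.9])
f_lo = lambda x: np.sin(8.0 * x)                    # cheap model
f_hi = lambda x: 1.2 * np.sin(8.0 * x) + 0.3 * x    # expensive model
y = np.concatenate([f_lo(x_lo), f_hi(x_hi)])

rho, ell_lo, ell_d, var_lo, var_d, jitter = 1.2, 0.15, 0.3, 1.0, 0.1, 1e-6

# Joint prior covariance of [y_lo(x_lo), y_hi(x_hi)] under
#   y_hi(x) = rho * y_lo(x) + delta(x),  y_lo ~ GP(0, k_lo),  delta ~ GP(0, k_d).
K_ll = rbf(x_lo, x_lo, ell_lo, var_lo)
K_lh = rho * rbf(x_lo, x_hi, ell_lo, var_lo)
K_hh = rho**2 * rbf(x_hi, x_hi, ell_lo, var_lo) + rbf(x_hi, x_hi, ell_d, var_d)
K = np.block([[K_ll, K_lh], [K_lh.T, K_hh]]) + jitter * np.eye(len(y))

# Predict the high-fidelity response on a test grid by standard GP conditioning.
x_s = np.linspace(0.0, 1.0, 50)
K_s = np.vstack([rho * rbf(x_lo, x_s, ell_lo, var_lo),
                 rho**2 * rbf(x_hi, x_s, ell_lo, var_lo) + rbf(x_hi, x_s, ell_d, var_d)])
K_ss = rho**2 * rbf(x_s, x_s, ell_lo, var_lo) + rbf(x_s, x_s, ell_d, var_d)

alpha = np.linalg.solve(K, y)
mean = K_s.T @ alpha
cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
print(mean[:5], np.sqrt(np.diag(cov))[:5])
```

With the hyperparameters fixed, this is only the non-Bayesian baseline; the fully Bayesian treatment described above additionally averages such predictions and their uncertainties over the kernel parameters, as sketched after the highlights below.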

Highlights

  • While Uncertainty Quantification (UQ) has become a term on its own in the computational engineering community, Bayesian Probability Theory is not yet widespread

  • We apply our method to quantify the uncertainties in finite element simulations [18] of Impedance Cardiography (ICG) [19] of Aortic Dissection (AD) [20]

  • We devised a fully Bayesian multi-level Gaussian process model to improve uncertainty quantification of expensive and scarce high-fidelity simulation data by augmenting the data set with low-fidelity simulations (a minimal illustrative sketch follows this list)
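
As a complement to the two-level sketch above, the following toy example illustrates the marginalization idea on a single-level Gaussian process: instead of optimizing the kernel length scale, predictions and their uncertainties are averaged over a grid of length scales weighted by the marginal likelihood, i.e., a one-dimensional numerical integral. The grid, the flat prior, and the data are illustrative assumptions, not the paper’s choices.

```python
# Hedged sketch of the fully Bayesian idea on a single-level GP: average the
# predictions and their uncertainties over the kernel length scale with a
# low-dimensional numerical integral instead of optimizing it.
import numpy as np

def rbf(xa, xb, ell):
    d = xa[:, None] - xb[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 6))           # deliberately little data
y = np.sin(6.0 * x) + 0.05 * rng.normal(size=6)
x_s = np.linspace(0.0, 1.0, 40)
noise = 1e-4

ells = np.linspace(0.05, 0.6, 30)               # quadrature grid over the length scale
log_w, means, variances = [], [], []
for ell in ells:
    K = rbf(x, x, ell) + noise * np.eye(len(x))
    K_s = rbf(x, x_s, ell)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # log marginal likelihood log p(y | ell) up to a constant; flat prior on ell
    log_w.append(-0.5 * y @ alpha - np.sum(np.log(np.diag(L))))
    means.append(K_s.T @ alpha)
    v = np.linalg.solve(L, K_s)
    variances.append(1.0 - np.sum(v * v, axis=0))

log_w = np.array(log_w)
w = np.exp(log_w - log_w.max()); w /= w.sum()   # normalized posterior weights
means, variances = np.array(means), np.array(variances)
mean = w @ means                                # mixture (marginalized) mean
var = w @ (variances + means**2) - mean**2      # mixture variance (law of total variance)
print(mean[:5], np.sqrt(var)[:5])
```

In the paper’s setting the remaining numerical integral stays similarly low-dimensional because, as stated in the abstract, all but the nonlinear or inseparable kernel parameters are marginalized analytically.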

Summary

Introduction

While Uncertainty Quantification (UQ) has become a term on its own in the computational engineering community, Bayesian Probability Theory is not yet widespread. We apply our method to quantify the uncertainties in finite element simulations [18] of Impedance Cardiography (ICG) [19] of Aortic Dissection (AD) [20]. In CT and MRT, the radiation fully penetrates the body. Still, they require a trained radiologist and long measurement times and pose radiation risks and high costs. Impedance cardiography could complement existing clinical procedures and could detect aortic dissection when medical imaging is not performed, be it due to the absence of suspicion or to the unavailability of the device itself.

Statistical Model
Prediction and Its Uncertainty
Algorithm and Mock Data Scrutiny
Conclusions