Abstract

Many computer models possess high-dimensional input spaces and require substantial computational time to produce a single model evaluation. Although such models are often ‘deterministic’, they are subject to a wide range of uncertainties. We argue that uncertainty quantification is crucial for computer model validation and reproducibility. We present a statistical framework, termed history matching, for performing a global parameter search by comparing model output to observed data. We employ Gaussian process (GP) emulators to produce fast predictions of model behaviour at arbitrary input parameter settings, allowing output uncertainty distributions to be calculated. History matching identifies sets of input parameters that give rise to acceptable matches between observed data and model output, given our representation of uncertainties. Modellers can then simulate the outputs of interest at these identified parameter settings and produce a range of predictions. This variability in model results is crucial for inter-model comparison as well as model development. We illustrate the performance of emulation and history matching on a simple one-dimensional toy model and in application to a climate model. This article is part of the theme issue ‘Reliability and reproducibility in computational science: implementing verification, validation and uncertainty quantification in silico’.
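
To make the emulation-and-history-matching workflow concrete, the sketch below fits a GP emulator to a handful of runs of a hypothetical one-dimensional toy simulator and rules out inputs whose implausibility exceeds the conventional cut-off of 3. The toy function, the observation, and the variance terms are illustrative assumptions, not the article's actual models or data.

```python
# Minimal history-matching sketch on a 1D toy simulator (illustrative only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def toy_simulator(x):
    # Stand-in for an expensive computer model; the form is an assumption.
    return np.sin(2.0 * np.pi * x) + 0.3 * x

# A small design of 'expensive' model runs used to train the emulator.
x_design = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
y_design = toy_simulator(x_design).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel(1.0) * RBF(length_scale=0.2),
                              normalize_y=True)
gp.fit(x_design, y_design)

# Observed data with assumed observation-error and model-discrepancy variances.
z_obs = 0.7
var_obs = 0.01
var_disc = 0.02

# Evaluate the emulator cheaply over a dense grid of candidate inputs.
x_grid = np.linspace(0.0, 1.0, 500).reshape(-1, 1)
mean, sd = gp.predict(x_grid, return_std=True)

# Implausibility: standardized distance between observation and emulator mean,
# inflated by the emulator, observation-error and discrepancy variances.
implausibility = np.abs(z_obs - mean) / np.sqrt(sd**2 + var_obs + var_disc)

# Inputs with implausibility below 3 are retained as the non-implausible
# region; the rest of the input space is ruled out.
non_implausible = x_grid[implausibility < 3.0]
print(f"Retained {non_implausible.size} of {x_grid.size} candidate inputs")
```

In practice this cycle is iterated in waves: new model runs are made inside the retained region, the emulator is refitted, and further input space is ruled out.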

Highlights

  • A computer model is a coded representation of a true process of interest

  • Galform, a model of galaxy formation, is employed to study the behaviour of galaxies in the presence of dark matter [2]. Other examples of powerful and complex models are the three-dimensional general circulation models (GCMs) of the atmosphere and ocean: numerical models based on the calculation of the budgets of mass, energy and momentum on a grid of columns on a sphere [3]

  • Analytical models are used in many ways, from appraising and evaluating the impact of policy options to planning current strategy based on future forecasts [6,7]

Summary

Introduction

A computer model (simulator) is a coded representation of a true process of interest. Galform, a model of galaxy formation, is employed to study the behaviour of galaxies in the presence of dark matter [2]. Other examples of powerful and complex models are the three-dimensional general circulation models (GCMs) of the atmosphere and ocean: numerical models based on the calculation of the budgets of mass, energy and momentum on a grid of columns on a sphere [3]. Probabilities and conditional probabilities are employed to represent modellers’ uncertainty about the quantities of interest and about observations of these quantities, respectively. For complex applications such as climate modelling, the subjective Bayesian approach is the only way to perform model-based inference about the physical process of interest by combining limited data with expert knowledge.
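
As a minimal illustration of this subjective Bayesian idea of combining limited data with expert judgement, a conjugate normal prior on a scalar quantity can be updated analytically; the numbers below are invented for illustration and are not values from the article.

```python
# Conjugate normal-normal Bayesian update: expert prior + limited data -> posterior.
# All numbers are illustrative assumptions.
import numpy as np

prior_mean, prior_var = 2.0, 1.0     # expert judgement about the quantity
data = np.array([2.6, 2.4, 2.9])     # a few noisy observations
obs_var = 0.25                       # assumed known observation-error variance

n = data.size
post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mean = post_var * (prior_mean / prior_var + data.sum() / obs_var)

print(f"posterior mean {post_mean:.3f}, posterior sd {post_var**0.5:.3f}")
```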

Sources of uncertainty
Methodologies in UQ
Example
Findings
Discussion