Abstract

Sensitivity Analysis (SA) and Uncertainty Quantification (UQ) are important components of modern numerical analysis. However, solving the relevant tasks involving large-scale fluid flow models currently remains a serious challenge. The difficulties are associated with the computational cost of running such models and with the large dimension of the discretized model input. The common approach is to construct a metamodel – an inexpensive approximation to the original model suitable for Monte Carlo simulations. The polynomial chaos (PC) expansion is the most popular approach for constructing metamodels. Some fluid flow models of interest are involved in the process of variational estimation/data assimilation. This implies that the tangent linear and adjoint counterparts of such models are available and, therefore, computing the gradient (first derivative) and the Hessian (second derivative) of a given function of the state variables is possible. New techniques for SA and UQ which benefit from using the derivatives are presented in this paper. Gradient-enhanced regression methods for computing the PC expansion have been developed recently. An intrinsic step of the regression method is a minimization process. It is often assumed that generating ‘data’ for the regression problem is significantly more expensive than solving the regression problem itself. This depends, however, on the size of the importance set, which is a subset of ‘influential’ inputs. A distinguishing feature of distributed parameter models is that the number of such inputs could still be large, which means that solving the regression problem becomes increasingly expensive. In this paper we propose a derivative-enhanced projection method, where no minimization is required. The method is based on the explicit relationships between the PC coefficients and the derivatives, complemented with a relatively inexpensive filtering procedure. The method is currently limited to PC expansions of the third order. Besides, we suggest an improved derivative-based global sensitivity measure. The numerical tests have been performed for the Burgers' model. The results confirm that the low-order PC expansion obtained by our method represents a useful tool for modeling the non-Gaussian behavior of the chosen quantity of interest (QoI).
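The explicit relationship between PC coefficients and derivatives can be illustrated on a toy problem. The following is a minimal sketch (not the paper's method or its Burgers' model): for standard Gaussian inputs, Stein's identity gives E[f(X)·X_i] = E[∂f/∂x_i], so the first-order Hermite-chaos coefficients can be obtained either by projection onto He₁(x_i) = x_i or by averaging the gradient. The sketch also evaluates the classical derivative-based global sensitivity measure (DGSM) ν_i = E[(∂f/∂x_i)²]; the function `f` and its gradient are hypothetical stand-ins for an expensive model and its adjoint.

```python
import numpy as np

# Toy QoI with an analytic gradient; a hypothetical stand-in for an
# expensive flow model whose gradient would come from an adjoint run.
def f(x):
    return np.sin(x[..., 0]) + 0.5 * x[..., 1] ** 2 + 0.1 * x[..., 0] * x[..., 1]

def grad_f(x):
    g = np.empty_like(x)
    g[..., 0] = np.cos(x[..., 0]) + 0.1 * x[..., 1]
    g[..., 1] = x[..., 1] + 0.1 * x[..., 0]
    return g

rng = np.random.default_rng(0)
X = rng.standard_normal((200_000, 2))   # standard Gaussian inputs

# First-order Hermite-PC coefficients, two ways:
#   projection:  c_i = E[f(X) * He1(X_i)] = E[f(X) * X_i]
#   derivative:  c_i = E[df/dx_i]          (Stein's identity)
c_proj = (f(X)[:, None] * X).mean(axis=0)
c_grad = grad_f(X).mean(axis=0)

# Derivative-based global sensitivity measure: nu_i = E[(df/dx_i)^2]
nu = (grad_f(X) ** 2).mean(axis=0)
```

Up to Monte Carlo noise, `c_proj` and `c_grad` agree, which is the kind of derivative/coefficient identity the projection method exploits; here c₁ ≈ e^{-1/2} analytically.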

Highlights

  • Sensitivity Analysis (SA) and Uncertainty Quantification (UQ) are important components of modern numerical analysis [8]

  • In our paper we focus on a non-intrusive approach, where the polynomial chaos (PC) expansion describes the relationship between the model inputs and outputs directly

  • One useful bonus associated with numerical models involved in variational estimation is the availability of the adjoint model, which allows the gradient of the standard data-mismatch cost function to be computed in a single model run

Introduction

Sensitivity Analysis (SA) and Uncertainty Quantification (UQ) are important components of modern numerical analysis [8]. The adjoint approach is computationally very efficient; its accuracy for nonlinear models depends on the validity of the tangent linear approximation for the error dynamics (the case of weak nonlinearity and/or small uncertainties). This is a key assumption in ensemble forecasting [27], since the method relies on computing the singular vectors of the propagator considered at the optimal solution point. One useful bonus associated with numerical models involved in variational estimation is the availability of the adjoint model, which allows the gradient (first derivative) of the standard data-mismatch cost function to be computed in a single model run. This adjoint model can be adapted to compute the gradient of any other QoI defined as a function of the state variables. The main outcomes of the paper are summarized in the Conclusions, and the major mathematical proofs and derivations are collected in the Appendices.
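The "gradient in a single model run" property can be seen on a toy discrete model. The following is a minimal sketch under simplifying assumptions (a linear propagator A standing in for the tangent linear model, and a quadratic data-mismatch cost; all names are hypothetical): one forward sweep produces the final state, and one backward sweep with the transposed propagator — the adjoint — yields the full gradient with respect to the initial condition, verified here against finite differences.

```python
import numpy as np

rng = np.random.default_rng(1)
n, N = 5, 20
A = 0.9 * rng.standard_normal((n, n)) / np.sqrt(n)  # toy linear propagator
d = rng.standard_normal(n)                          # synthetic "observation"

def forward(x0):
    """One forward model run: x_{k+1} = A x_k for N steps."""
    x = x0
    for _ in range(N):
        x = A @ x
    return x

def cost(x0):
    """Data-mismatch cost J = 0.5 * ||x_N - d||^2."""
    r = forward(x0) - d
    return 0.5 * r @ r

def adjoint_gradient(x0):
    """dJ/dx0 from one forward run plus one backward (adjoint) sweep."""
    lam = forward(x0) - d        # dJ/dx_N
    for _ in range(N):
        lam = A.T @ lam          # adjoint propagator step
    return lam

x0 = rng.standard_normal(n)
g_adj = adjoint_gradient(x0)

# Finite-difference check: n*2 extra cost evaluations, vs. one adjoint sweep.
eps = 1e-6
g_fd = np.array([(cost(x0 + eps * e) - cost(x0 - eps * e)) / (2 * eps)
                 for e in np.eye(n)])
```

The contrast in cost is the point: finite differences need O(n) model runs for an n-dimensional input, while the adjoint delivers the same gradient at roughly the cost of two model runs, independent of n.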

Problem setup
Explicit expressions for the PC coefficients
Global sensitivity analysis
Zero- and first-order coefficients
Second-order coefficients with filtering
Third-order coefficients with filtering
Adaptation of metamodels
Forward model
Derivatives
Numerical experiments
Filtering procedure
Probability density via histogram
CPU-time estimates
Conclusions