In the framework of the estimation of safety margins in nuclear accident analysis, a quantitative assessment of the uncertainties affecting the results of computer simulations is essential. Accurate uncertainty propagation (estimation of high probabilities or quantiles) and quantitative sensitivity analysis can require several thousand code simulations. Complex computer codes, such as those used in thermal-hydraulic accident scenario simulations, are often too CPU-expensive to be used directly for these studies. One solution is to replace the computer model with a CPU-inexpensive mathematical function, called a metamodel, built from a reduced number of code simulations. However, in the case of high-dimensional experiments (typically with several tens of inputs), the metamodel building process remains difficult. To overcome this limitation, we propose a methodology which combines several advanced statistical tools: an initial space-filling design, screening to identify the noninfluential inputs, and Gaussian process (Gp) metamodel building with the group of influential inputs as explanatory variables. The residual effect of the group of noninfluential inputs is captured by another Gp metamodel. The resulting joint Gp metamodel is then used to accurately estimate Sobol' sensitivity indices and high quantiles (here, the 95% quantile). The ability of the methodology to handle a large number of inputs and reduce the required calculation budget is illustrated on a thermal-hydraulic test case in which the CATHARE2 code simulates a loss-of-coolant accident scenario in a pressurized water reactor. A predictive Gp metamodel is built with only a few hundred code simulations, which allows the calculation of the Sobol' sensitivity indices. This Gp also provides a more accurate estimation of the 95% quantile and its associated confidence interval than the empirical approach, at equal calculation budget. Moreover, on this test case, the joint Gp approach outperforms the simple Gp.
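To make the workflow concrete, below is a minimal Python sketch of the pipeline described above, assuming scikit-learn and SciPy are available. It is an illustration under stated assumptions, not the authors' implementation: the toy simulator stands in for the CATHARE2 code, the rank-correlation screening rule and its 0.1 threshold are crude placeholders for the paper's formal screening procedure, and the kernels, sample sizes, and log-residual formulation of the joint Gp are assumptions.

```python
import numpy as np
from scipy.stats import qmc, spearmanr
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(0)
d, n = 20, 300  # several tens of inputs; "a few hundred" training runs

# 1) Initial space-filling design: Latin hypercube sample on [0, 1]^d.
X = qmc.LatinHypercube(d=d, seed=0).random(n)

def simulator(X):
    """Toy stand-in for the CPU-expensive CATHARE2 code (assumption)."""
    return np.sin(2.0 * X[:, 0]) + 0.7 * X[:, 1] ** 2 + 0.05 * X[:, 2:].sum(axis=1)

y = simulator(X)

# 2) Screening: a crude rank-correlation rule flags influential inputs
#    (the 0.1 threshold is a placeholder, not the paper's procedure).
rho = np.array([abs(spearmanr(X[:, j], y)[0]) for j in range(d)])
infl = np.flatnonzero(rho > 0.1)          # group of influential inputs
rest = np.setdiff1d(np.arange(d), infl)   # group of noninfluential inputs

# 3) Gp metamodel for the mean, with only the influential inputs as
#    explanatory variables; the white-noise term absorbs the rest.
gp_mean = GaussianProcessRegressor(
    kernel=Matern(nu=2.5) + WhiteKernel(), normalize_y=True
).fit(X[:, infl], y)

# 4) Joint part: a second Gp models the residual dispersion due to the
#    noninfluential group (here fitted on log squared residuals).
res2 = (y - gp_mean.predict(X[:, infl])) ** 2
gp_disp = GaussianProcessRegressor(
    kernel=Matern(nu=2.5) + WhiteKernel(), normalize_y=True
).fit(X[:, rest], np.log(res2 + 1e-12))

# 5) First-order Sobol' indices by pick-freeze Monte Carlo on the cheap
#    metamodel instead of the code itself (Saltelli-type estimator).
N = 10_000
A, B = rng.random((N, d)), rng.random((N, d))
yA, yB = gp_mean.predict(A[:, infl]), gp_mean.predict(B[:, infl])
V = np.var(np.concatenate([yA, yB]))
for j in infl:
    ABj = A.copy()
    ABj[:, j] = B[:, j]
    Sj = np.mean(yB * (gp_mean.predict(ABj[:, infl]) - yA)) / V
    print(f"S_{j} ~ {Sj:.3f}")

# 6) 95% quantile from a large sample of the joint metamodel: mean
#    prediction plus Gaussian dispersion predicted by the second Gp.
Xbig = rng.random((100_000, d))
mu = gp_mean.predict(Xbig[:, infl])
sig = np.sqrt(np.exp(gp_disp.predict(Xbig[:, rest])))
q95 = np.quantile(mu + sig * rng.standard_normal(mu.size), 0.95)
print(f"95% quantile ~ {q95:.3f}")
```

The key design point the sketch reflects is that the expensive simulator is called only n times to build the design; all subsequent Monte Carlo sampling for the Sobol' indices and the quantile runs on the metamodel, which is why a budget of a few hundred simulations can suffice.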