Abstract

Sobol' sensitivity indices quantify the respective effects of random input variables and their combinations on the variance of a mathematical model output. We focus on the problem of Sobol' indices estimation via a metamodeling approach, in which the true mathematical model is replaced with a sample-based approximation used to compute the sensitivity indices. We propose a new method for quality control of the indices and obtain asymptotic and non-asymptotic risk bounds for Sobol' indices estimates based on a general class of metamodels. Our analysis is closely connected with the problem of nonparametric function fitting using orthogonal systems of functions in the random design setting. It considers the relation between the metamodel quality and the error of the corresponding estimator of the Sobol' indices and shows the possibility of fast convergence rates in the case of noiseless observations. The theoretical results are complemented with numerical experiments for approximations based on multivariate Legendre and trigonometric polynomials.
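For context, the quantities being estimated are the standard Sobol' indices defined through the ANOVA (Sobol') decomposition; the notation below is generic background rather than the paper's own:

```latex
% Standard Sobol'/ANOVA decomposition (background, not the paper's exact notation).
% For a square-integrable model f with independent inputs X = (X_1, \dots, X_d):
f(X) = f_0 + \sum_{\emptyset \neq u \subseteq \{1,\dots,d\}} f_u(X_u),
\qquad
\operatorname{Var} f(X) = \sum_{\emptyset \neq u} \operatorname{Var} f_u(X_u),
\qquad
S_u = \frac{\operatorname{Var} f_u(X_u)}{\operatorname{Var} f(X)}.
```

In the metamodeling approach, f is replaced by a sample-based approximation, and the indices S_u are computed from that approximation instead of the true model.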

Highlights

  • We address the problem of the accuracy and risk of metamodel-based Sobol' indices estimates in the random design setting

  • We obtain a general relation between the accuracy of an arbitrary metamodel and the error of the estimated Sobol' indices, as well as specific asymptotic and non-asymptotic relations for two parameter estimation methods for metamodels with tensor structure, including approximations based on multivariate Legendre, Chebyshev, and trigonometric polynomials (see the sketch after this list)

  • Based on the obtained relation, we propose a method for Sobol' indices quality control that demonstrates good performance compared to an existing approach
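As a concrete illustration of the metamodeling approach referred to above, the sketch below fits a tensor-product Legendre metamodel by least squares on a random design and reads off first-order Sobol' indices from its coefficients. The function names and the toy model are illustrative assumptions, not the authors' estimator or code.

```python
# Hypothetical sketch: first-order Sobol' indices from a tensor-product Legendre
# metamodel fitted by least squares on a random design (inputs uniform on [-1, 1]^d).
import itertools
import numpy as np
from numpy.polynomial import legendre


def legendre_basis(X, degree):
    """Evaluate an orthonormal tensor-product Legendre basis at the points X.

    Returns the design matrix of shape (n, n_terms) and the list of multi-indices.
    """
    n, d = X.shape
    multi_indices = list(itertools.product(range(degree + 1), repeat=d))
    cols = []
    for m in multi_indices:
        col = np.ones(n)
        for j, deg in enumerate(m):
            c = np.zeros(deg + 1)
            c[deg] = 1.0
            # sqrt(2*deg + 1) normalises P_deg w.r.t. the uniform density on [-1, 1]
            col *= legendre.legval(X[:, j], c) * np.sqrt(2 * deg + 1)
        cols.append(col)
    return np.column_stack(cols), multi_indices


def sobol_from_metamodel(X, y, degree):
    """Fit the metamodel by least squares and compute first-order Sobol' indices."""
    Phi, multi_indices = legendre_basis(X, degree)
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    d = X.shape[1]
    # Metamodel variance: sum of squared coefficients of all non-constant terms
    nonconst = [i for i, m in enumerate(multi_indices) if any(m)]
    total_var = np.sum(coef[nonconst] ** 2)
    S = np.zeros(d)
    for i, m in enumerate(multi_indices):
        active = [j for j, deg in enumerate(m) if deg > 0]
        if len(active) == 1:  # term depends on a single input only
            S[active[0]] += coef[i] ** 2
    return S / total_var


# Usage on a toy model with three inputs uniform on [-1, 1]
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(2000, 3))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 0] * X[:, 2]
print(sobol_from_metamodel(X, y, degree=3))
```

Because the basis is orthonormal with respect to the input distribution, the partial variances needed for the Sobol' indices are simply sums of squared metamodel coefficients; the same idea carries over to Chebyshev and trigonometric tensor bases with the appropriate weights.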



Introduction

Modern computational models have become so complicated that there is an increasing need for special methods for their analysis. This analysis has become even more important with the advent of artificial neural networks. Sensitivity analysis, an important tool for the investigation of computational models, studies how different model input parameters influence the model output, which parameters are the most influential, and how to evaluate such effects quantitatively [41]. It allows one to better understand the behavior of computational models: for example, parameters whose variability has a strong effect on the model output may need to be controlled more accurately. Since complex computational models often suffer from overparametrization, excluding unimportant parameters can potentially improve model quality and reduce parametrization and computational costs [18].

