Abstract
In this paper, we examine two widely used approaches to surrogate modeling: polynomial chaos expansion (PCE) and Gaussian process (GP) regression. The theoretical differences between the PCE and GP approximations are discussed. A state-of-the-art PCE approach is constructed from high-precision quadrature points, but the necessary truncation may cause a loss of precision. The GP approach performs well on small datasets and allows a fine and precise trade-off between fitting the data and smoothing, but its overall performance depends largely on the training dataset. The reproducing kernel Hilbert space (RKHS) and Mercer’s theorem are introduced to link the two methods: it is proven that the two surrogates can be embedded in two isomorphic RKHSs. On this basis, we propose a novel method, Gaussian process on polynomial chaos basis (GPCB), that combines PCE and GP. A theoretical comparison between PCE and GPCB is made with the help of the Kullback–Leibler divergence, showing that GPCB is as stable and accurate as the PCE method. Furthermore, GPCB is a one-step Bayesian method that selects the best subset of the RKHS in which the true function should lie, whereas the PCE method requires an adaptive procedure. Simulations on 1D and 2D benchmark functions show that GPCB outperforms both the PCE and classical GP methods. To solve high-dimensional problems, a random sampling scheme based on a constructive design (i.e., the tensor product of quadrature points) is proposed to generate a valid training dataset for the GPCB method. This approach exploits the high numerical accuracy underlying the quadrature points while ensuring computational feasibility. Finally, the experimental results show that our sampling strategy is more accurate than classical experimental designs and is suitable for high-dimensional problems.
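The GPCB idea described above can be illustrated with a minimal sketch: build a truncated Mercer kernel from normalized probabilists' Hermite polynomials (the PCE basis for a standard normal input) and plug it into the standard GP posterior-mean formula. This is an assumption-laden illustration, not the paper's implementation; the function names `pce_kernel` and `gp_posterior_mean`, the truncation degree, and the jitter term are all illustrative choices.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval

def pce_kernel(x1, x2, degree=8):
    """Truncated Mercer kernel from normalized probabilists' Hermite
    polynomials: k(x, x') = sum_{n=0}^{degree} He_n(x) He_n(x') / n!.
    (Illustrative truncation; the paper's kernel may weight terms differently.)"""
    k = np.zeros((len(x1), len(x2)))
    for n in range(degree + 1):
        c = np.zeros(n + 1)
        c[n] = 1.0  # coefficient vector selecting He_n
        k += np.outer(hermeval(x1, c), hermeval(x2, c)) / math.factorial(n)
    return k

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-6, degree=8):
    # Standard GP regression posterior mean with the polynomial-chaos
    # kernel; `noise` is a small jitter for numerical stability.
    K = pce_kernel(x_train, x_train, degree) + noise * np.eye(len(x_train))
    Ks = pce_kernel(x_test, x_train, degree)
    return Ks @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(0)
x = rng.standard_normal(30)
y = np.sin(x)
x_star = np.linspace(-1.5, 1.5, 7)
mean = gp_posterior_mean(x, y, x_star)
```

Because the kernel's feature space is a finite polynomial span, this GP posterior mean coincides with Bayesian linear regression on the PCE basis, which is the isomorphism the abstract refers to.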
Highlights
Computer simulations are widely used in learning tasks, where a single simulation is an instance of the system [1,2]
HG and HP are isomorphic, as discussed in Section 3, so it is natural to conduct GP regression with k(·, ·) taken as the Mercer kernel generated by the polynomial basis of the polynomial chaos expansion (PCE); the resulting model is called Gaussian process on polynomial chaos basis (GPCB)
This paper has examined two different surrogate models for computational models, i.e., polynomial chaos expansion and Gaussian process regression
Summary
Computer simulations are widely used in learning tasks, where a single simulation is an instance of the system [1,2]. Earlier work introduced a meta-modeling method named PC-kriging [29] (polynomial-chaos-based kriging) to solve problems such as rare event estimation [30], structural reliability analysis [31], and quantile estimation [32]. In those papers, the PCE model can be viewed as a special form of GP in which a Dirac function serves as the kernel. The PC-kriging model treats the coefficients as parameters to be optimized, and the solution can be derived by Bayesian linear regression with a basis consisting of the PCE polynomials; the LARS algorithm is used to calibrate the model and to select a sparse design. Later in this paper, an explicit Mehler kernel is presented with the Hermite polynomial basis in the last part of Section 4; several tests of the GPCB on benchmark functions are presented in Section 5, along with the random constructive sampling method for high-dimensional problems
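The random constructive sampling mentioned above can be sketched as follows: draw training points uniformly from the tensor product of 1-D Gauss–Hermite quadrature nodes without ever enumerating the full grid, whose size grows exponentially with dimension. This is a minimal sketch under the assumption of a standard normal input in each coordinate; the function name `random_quadrature_design` and the specific node counts are illustrative.

```python
import numpy as np

def random_quadrature_design(dim, n_1d, n_samples, seed=0):
    """Sample points from the tensor-product grid of 1-D Gauss-Hermite
    nodes by drawing a node index per coordinate, avoiding the
    n_1d**dim enumeration that is infeasible in high dimension."""
    # hermgauss returns physicists' nodes for the weight exp(-x^2);
    # rescale by sqrt(2) to match the standard normal weight.
    nodes, _ = np.polynomial.hermite.hermgauss(n_1d)
    nodes = nodes * np.sqrt(2.0)
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, n_1d, size=(n_samples, dim))  # random node per coordinate
    return nodes[idx]

X = random_quadrature_design(dim=10, n_1d=5, n_samples=200)
```

Every training point thus inherits the coordinates of high-accuracy quadrature nodes, while the sample size stays linear in the number of requested points rather than exponential in the dimension.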