Surrogate models provide an affordable alternative to the evaluation of expensive deterministic functions. However, the construction of accurate surrogate models with many independent variables is currently prohibitive because it requires a large number of function evaluations to reach the desired accuracy. Gradient-enhanced kriging has the potential to reduce the number of evaluations when an efficient means of computing gradients, such as an adjoint method, is available. However, current gradient-enhanced kriging methods do not scale well with the number of sampling points because the correlation matrix grows rapidly: new information is added for each sampling point in each direction of the design space. Nor do they scale well with the number of independent variables, because the number of hyperparameters that must be estimated grows accordingly. To address these issues, we develop a new gradient-enhanced surrogate model approach that drastically reduces the number of hyperparameters through the use of the partial least squares method, while maintaining accuracy. In addition, this method controls the size of the correlation matrix by adding only the relevant points identified by the partial least squares method. To validate our method, we compare its global accuracy with that of conventional kriging surrogate models on two analytic functions with up to 100 dimensions, as well as on engineering problems of varied complexity with up to 15 dimensions. We show that the proposed method requires fewer sampling points than conventional methods to reach the desired accuracy, or equivalently, provides higher accuracy for a fixed budget of sampling points. In some cases, the resulting models are over three times more accurate than previously developed surrogate models for the same computational time, and over 3200 times faster than standard gradient-enhanced kriging models for the same accuracy.