Abstract

Leave-one-out cross-validation (LOOCV) is widely used for model estimation and selection of the Kriging surrogate model in engineering problems such as structural optimization and reliability analysis. However, the traditional LOOCV method suffers from shortcomings in both accuracy and efficiency. This paper proposes an enhanced-LOOCV method that incorporates the hyperparameters of the Kriging model built on the complete training dataset (i.e., the complete Kriging model) into the LOOCV error calculation. Keeping the hyperparameters in each LOOCV fold consistent with those of the complete Kriging model reduces the number of hyperparameter optimizations and significantly improves the accuracy and efficiency of the LOOCV process. Additionally, a decremental calculation is proposed to reduce the computational cost of the correlation matrix inversion without sacrificing accuracy, improving the time complexity of the traditional LOOCV from O(n⁴) to O(n³). Experimental results on thirty test functions verify that the enhanced-LOOCV achieves better estimation performance than the Kriging model with significantly higher efficiency than the traditional LOOCV. Numerical experiments and an engineering optimization case demonstrate that the enhanced-LOOCV reduces the number of infill samples required by the Kriging model, making it better suited to expensive engineering optimization problems.
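
To make the two ideas above concrete, the following is a minimal sketch (not the authors' implementation) of the enhanced-LOOCV error calculation: the correlation hyperparameters theta are fitted once on the complete training set and reused in every fold, and the inverse of each fold's correlation matrix is obtained decrementally from the full inverse via a block-inverse identity, so each fold costs O(n²) rather than a fresh O(n³) inversion. The zero-mean simple-Kriging predictor, the Gaussian correlation function, and names such as loocv_errors_fixed_theta are assumptions made here purely for illustration.

    import numpy as np

    def gaussian_corr(X, theta):
        # Gaussian correlation: R[i, j] = exp(-sum_k theta_k * (X[i, k] - X[j, k])**2)
        diff = X[:, None, :] - X[None, :, :]
        return np.exp(-np.einsum('ijk,k->ij', diff ** 2, theta))

    def loocv_errors_fixed_theta(X, y, theta, nugget=1e-10):
        # LOOCV residuals with hyperparameters theta reused from the complete Kriging model.
        n = len(y)
        R = gaussian_corr(X, theta) + nugget * np.eye(n)
        R_inv = np.linalg.inv(R)                 # single O(n^3) inversion
        errors = np.empty(n)
        for i in range(n):
            keep = np.delete(np.arange(n), i)
            # Decremental update: inverse of R with row/column i removed,
            # recovered from the full inverse in O(n^2) per fold.
            R_inv_sub = (R_inv[np.ix_(keep, keep)]
                         - np.outer(R_inv[keep, i], R_inv[i, keep]) / R_inv[i, i])
            r_i = R[keep, i]                     # correlations between x_i and the remaining points
            y_hat = r_i @ (R_inv_sub @ y[keep])  # zero-mean simple-Kriging prediction at x_i
            errors[i] = y[i] - y_hat
        return errors

    # Example usage; in practice theta would come from one likelihood fit on the full data.
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(20, 2))
    y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2
    theta = np.array([1.0, 1.0])
    print(loocv_errors_fixed_theta(X, y, theta))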
