Abstract

Bayesian model updating based on Gaussian Process (GP) models has received considerable attention in recent years, as kernel-based GPs provide higher-fidelity response predictions. Although most kernel functions achieve high fitting accuracy on the training data set, their out-of-sample predictions can be highly inaccurate. This paper addresses the issue by reformulating the model updating task on a consistent probabilistic foundation, reviewing common choices of kernel covariance functions, and proposing a new Bayesian model selection approach for choosing the kernel function, aiming to balance fitting accuracy, generalizability, and model parsimony. Computational aspects are addressed via the Laplace approximation and sampling techniques, with detailed algorithms and implementation strategies. Numerical and experimental examples demonstrate the accuracy and robustness of the proposed framework. As a result, an exponential-trigonometric covariance function is characterized and justified based on the Bayesian model selection approach and observations of the sample autocorrelation function of the response discrepancies.
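
To illustrate the idea of kernel selection, the minimal sketch below compares candidate GP covariance functions by their maximized log marginal likelihood on synthetic data. This is only a rough stand-in for the paper's full Bayesian model selection, which evaluates the model evidence via the Laplace approximation and sampling; the data, kernel candidates, and hyperparameter settings here are illustrative assumptions, and the exponential-trigonometric covariance is approximated by a product of a squared-exponential and a periodic kernel.

```python
# Minimal sketch (not the paper's implementation): rank candidate GP kernels by
# their maximized log marginal likelihood as a simple proxy for Bayesian model
# selection. The "exponential-trigonometric" candidate is approximated by an
# RBF x ExpSineSquared product kernel; all data and settings are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern, ExpSineSquared

rng = np.random.default_rng(0)

# Synthetic "response discrepancy" data: a damped oscillatory trend plus noise.
X = np.linspace(0.0, 10.0, 60).reshape(-1, 1)
y = np.exp(-0.2 * X.ravel()) * np.cos(2.0 * X.ravel()) \
    + 0.05 * rng.standard_normal(X.shape[0])

# Candidate covariance functions (kernels) to be compared.
candidates = {
    "squared-exponential": RBF(length_scale=1.0),
    "Matern(nu=1.5)": Matern(length_scale=1.0, nu=1.5),
    # Proxy for an exponential-trigonometric covariance: smooth decay x periodicity.
    "exp-trig (RBF * periodic)": RBF(length_scale=1.0)
        * ExpSineSquared(length_scale=1.0, periodicity=3.0),
}

for name, kernel in candidates.items():
    gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-4,
                                  normalize_y=True, n_restarts_optimizer=5)
    gp.fit(X, y)
    # A higher maximized log marginal likelihood indicates a better trade-off
    # between data fit and model complexity on this training set.
    print(f"{name:28s} log marginal likelihood = "
          f"{gp.log_marginal_likelihood_value_:.2f}")
```

Note that the maximized marginal likelihood used above does not integrate over the kernel hyperparameters, so it only approximates the evidence-based comparison described in the abstract.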
