Abstract

We present a new surrogate modeling technique for efficient approximation of input-output maps governed by parametrized PDEs. The model is hierarchical in the sense that it is built on a chain consisting of a full order model, a reduced order model (ROM), and a machine learning (ML) model. The model is adaptive in the sense that the ROM and ML model are adapted on the fly during a sequence of parametric requests to the model. To allow for a certification of the model hierarchy, as well as to control the adaptation process, we employ rigorous a posteriori error estimates for the ROM and ML models. In particular, we provide an example of an ML-based model that allows for rigorous analytical quality statements. We demonstrate the efficiency of the modeling chain on a Monte Carlo example and a parameter-optimization example. Here, the ROM is instantiated by Reduced Basis methods, and the ML model is given by a neural network or by a kernel model using vectorial kernel orthogonal greedy algorithms.
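To make the hierarchical and adaptive query logic described in the abstract concrete, the following is a minimal Python sketch of one possible realization: each request is answered by the cheapest model whose a posteriori error estimate meets the prescribed tolerance, and higher-fidelity fallbacks are used to adapt the cheaper models on the fly. The class and method names (`MLModel`-style `predict`/`error_estimate`/`adapt`, `ROM.solve`, `FOM.solve`) are illustrative assumptions, not the authors' actual interface.

```python
class HierarchicalSurrogate:
    """Sketch of a certified FOM -> ROM -> ML model chain (interface assumed)."""

    def __init__(self, fom, rom, ml_model, tol):
        self.fom = fom        # full order model: accurate but expensive
        self.rom = rom        # reduced order model with a posteriori error estimator
        self.ml = ml_model    # ML surrogate (e.g. neural network or VKOGA kernel model)
        self.tol = tol        # prescribed accuracy for every returned output

    def evaluate(self, mu):
        """Return an output for parameter mu whose error estimate is below tol."""
        # 1) Try the cheapest model first.
        u_ml = self.ml.predict(mu)
        if self.ml.error_estimate(mu) <= self.tol:
            return u_ml

        # 2) Fall back to the ROM; a certified ROM output can also serve as
        #    additional training data for the ML model.
        u_rom = self.rom.solve(mu)
        if self.rom.error_estimate(mu) <= self.tol:
            self.ml.adapt(mu, u_rom)
            return u_rom

        # 3) Last resort: the full order model, whose solution is used to
        #    enrich the ROM (e.g. extend the reduced basis) and the ML model.
        u_fom = self.fom.solve(mu)
        self.rom.adapt(mu, u_fom)
        self.ml.adapt(mu, u_fom)
        return u_fom
```

In such a scheme, a sequence of parametric requests (as in the Monte Carlo or optimization examples mentioned above) progressively shifts work from the expensive full order model to the cheaper surrogates while every returned output remains certified.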
