Abstract

Reduced bases have been introduced for the approximation of parametrized PDEs in applications where many online queries are required. Their numerical efficiency for such problems has been theoretically confirmed in Binev et al. (SIAM J. Math. Anal. 43 (2011) 1457–1472) and DeVore et al. (Constructive Approximation 37 (2013) 455–466), where it is shown that the reduced basis space V_n of dimension n, constructed by a certain greedy strategy, has an approximation error comparable to that of the optimal space associated with the Kolmogorov n-width of the solution manifold. The greedy construction of the reduced basis space is performed in an offline stage which requires, at each step, a maximization of the current error over the parameter space. For the purpose of numerical computation, this maximization is performed over a finite training set obtained through a discretization of the parameter domain. To guarantee a final approximation error ε for the space generated by the greedy algorithm requires in principle that the snapshots associated with this training set constitute an approximation net for the solution manifold with accuracy of order ε. Hence the size of the training set is the ε-covering number of M, and this covering number typically behaves like exp(Cε^{-1/s}) for some C > 0 when the solution manifold has n-width decay O(n^{-s}). Thus, the sheer size of the training set prohibits implementation of the algorithm when ε is small. The main result of this paper shows that, if one is willing to accept results which hold with high probability, rather than with certainty, then for a large class of relevant problems one may replace the fine discretization by a random training set of size polynomial in ε^{-1}. Our proof of this fact relies on inverse inequalities for polynomials in high dimensions.

Keywords and phrases: performance bounds, random sampling, entropy numbers, Kolmogorov n-widths, sparse high-dimensional polynomial approximation.
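The offline greedy construction described above can be summarized in a few lines. The following is a minimal sketch, not the implementation used in the paper: solve(y) (the high-fidelity solver), error_indicator(y, basis) (a computable surrogate for the projection error), and the training set Y_train are hypothetical placeholders for whichever solver, certified error bound, and parameter discretization (ε-net or random sample) one actually uses.

    import numpy as np

    def greedy_reduced_basis(Y_train, solve, error_indicator, tol, n_max):
        # Greedy selection of snapshots over a finite training set Y_train.
        # solve(y) returns a high-fidelity snapshot u(y) as a vector;
        # error_indicator(y, basis) estimates the error of projecting u(y)
        # onto span(basis). Both are placeholders for the user's own code.
        basis = []
        for _ in range(n_max):
            # Offline step: maximize the (surrogate) error over the training set.
            errors = [error_indicator(y, basis) for y in Y_train]
            k = int(np.argmax(errors))
            if errors[k] <= tol:
                break                          # target accuracy reached on Y_train
            u = solve(Y_train[k])              # compute the new snapshot
            for v in basis:                    # Gram-Schmidt against current basis
                u = u - np.dot(u, v) * v
            u = u / np.linalg.norm(u)
            basis.append(u)
        return basis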

Highlights

  • Complex systems are frequently described by parametrized PDEs that take the general form P(u, y) = 0, where y = (y_j)_{j=1,...,d} is a vector of parameters ranging over some domain Y ⊂ R^d and u = u(y) is the corresponding solution, assumed to be uniquely defined in some Hilbert space V for every y ∈ Y.

  • The prohibitive number of error bound evaluations is the limiting factor in practice and poses the main obstruction to the feasibility of certified reduced basis methods in the regime of polynomially decaying n-widths, in particular in the context of high parametric dimension. We show that this obstruction can be circumvented by not searching for an ε-net of M but rather defining the training set Ỹ by random sampling of Y (see the sketch after this list).

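As a concrete illustration of this alternative (a sketch under stated assumptions, not the paper's construction), a random training set for a parameter box Y = [-1, 1]^d can be drawn as follows; the size m = ⌈ε^{-q}⌉ with q = 2 is a hypothetical polynomial-in-ε^{-1} choice standing in for the bound established in the paper.

    import numpy as np

    def random_training_set(d, eps, q=2, seed=None):
        # Draw m i.i.d. uniform samples from Y = [-1, 1]^d, with m polynomial
        # in 1/eps. The exponent q is an illustrative placeholder, not the
        # value proved in the paper; an eps-net of the solution manifold would
        # instead require a number of points growing like exp(C * eps**(-1/s)).
        rng = np.random.default_rng(seed)
        m = int(np.ceil(eps ** (-q)))
        return rng.uniform(-1.0, 1.0, size=(m, d))

    # Example: d = 20 parameters, target accuracy eps = 1e-2 -> 10 000 points.
    Y_train = random_training_set(d=20, eps=1e-2)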

Summary

Introduction

Complex systems are frequently described by parametrized PDEs that take the general form P(u, y) = 0. In the typical case when the Kolmogorov n-width of M decays like O(n^{-s}) for some s > 0, we can invoke Carl's inequality [13] to obtain a sharp bound e^{cε^{-1/s}} for the cardinality of an ε-approximation net of M and of the associated training set Ỹ. This exponential growth drastically limits the possibility of using ε-approximation nets in practical applications when the number of involved parameters becomes large. Our main result shows that a target accuracy ε can generally be met with high probability by searching over a randomly discretized set Ỹ whose size grows only polynomially in ε^{-1} rather than exponentially, in contrast to ε-approximation nets.
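To make the contrast concrete, the following back-of-the-envelope comparison uses purely illustrative values (s = 1/2, unit constants, ε = 10^{-2}); neither the constants nor the polynomial exponent are taken from the paper.

    # Illustrative growth comparison at accuracy eps, assuming n-width decay
    # O(n^{-s}) with s = 1/2 and unit constants (placeholders, not paper values).
    eps, s = 1e-2, 0.5

    net_exponent = eps ** (-1.0 / s)     # eps-net cardinality ~ exp(c * eps**(-1/s))
    random_size = eps ** (-3)            # an illustrative polynomial bound eps**(-3)

    print(f"eps-net bound:          exp({net_exponent:.0f}) points")   # exp(10000)
    print(f"random training bound:  {random_size:.0f} points")         # 1000000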

Performance and complexity of reduced basis greedy algorithms
Polynomial approximation
The main result
Findings
Numerical illustration