Abstract

This paper is concerned with the approximation of a function $u$ in a given subspace $V_m$ of dimension $m$ from evaluations of the function at $n$ suitably chosen points. The aim is to construct an approximation of $u$ in $V_m$ which yields an error close to the best approximation error in $V_m$, using as few evaluations as possible. Classical least-squares regression, which defines a projection in $V_m$ from $n$ random points, usually requires a large $n$ to guarantee a stable approximation and an error close to the best approximation error. This is a major drawback for applications where $u$ is expensive to evaluate. One remedy is to use a weighted least-squares projection based on $n$ samples drawn from a properly selected distribution. In this paper, we introduce a boosted weighted least-squares method which ensures, almost surely, the stability of the weighted least-squares projection with a sample size close to the interpolation regime $n = m$. It consists in sampling according to a measure associated with the optimization of a stability criterion over a collection of independent $n$-samples, and resampling according to this measure until a stability condition is satisfied. A greedy method is then proposed to remove points from the obtained sample. Quasi-optimality properties in expectation are obtained for the weighted least-squares projection, with or without the greedy procedure. The proposed method is validated on numerical examples and compared to state-of-the-art interpolation and weighted least-squares methods.
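The sampling-and-resampling loop described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it assumes a concrete setting chosen for the example ($V_m$ spanned by the first $m$ orthonormal Legendre polynomials on $[-1,1]$ with the uniform reference measure, and the spectral norm $\|G - I\|_2$ of the empirical Gram matrix as the stability criterion), and it omits the greedy point-removal step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setting for illustration: V_m spanned by the first m orthonormal
# Legendre polynomials on [-1, 1], uniform reference measure d(mu) = dx/2.
m, n, delta = 5, 10, 0.9  # subspace dimension, sample size, stability threshold

def basis(x):
    """Orthonormal Legendre basis at points x; array of shape (len(x), m)."""
    return np.polynomial.legendre.legvander(x, m - 1) * np.sqrt(2 * np.arange(m) + 1)

def draw_optimal_sample(n):
    """Draw n i.i.d. points from the density (1/m) sum_j phi_j(x)^2 d(mu) by
    rejection sampling from mu (this density is bounded by m, attained at x = +/-1)."""
    pts = np.empty(0)
    while pts.size < n:
        x = rng.uniform(-1.0, 1.0, size=4 * m * n)  # batch of proposals from mu
        dens = np.sum(basis(x) ** 2, axis=1) / m    # density w.r.t. mu
        pts = np.concatenate([pts, x[rng.uniform(size=x.size) < dens / m]])
    return pts[:n]

def gram(x):
    """Empirical Gram matrix G = (1/n) sum_i w(x_i) phi(x_i) phi(x_i)^T, E[G] = I."""
    Phi = basis(x)
    w = m / np.sum(Phi ** 2, axis=1)  # weights w(x) = m / sum_j phi_j(x)^2
    return (Phi * w[:, None]).T @ Phi / len(x)

def boosted_sample(n, M=10, max_rounds=100):
    """Among M independent n-samples keep the one minimizing ||G - I||_2,
    and resample until the stability condition ||G - I||_2 <= delta holds."""
    for _ in range(max_rounds):
        candidates = [draw_optimal_sample(n) for _ in range(M)]
        errors = [np.linalg.norm(gram(c) - np.eye(m), 2) for c in candidates]
        best = int(np.argmin(errors))
        if errors[best] <= delta:
            return candidates[best]
    raise RuntimeError("stability condition not reached")

# Weighted least-squares projection of an example function u onto V_m.
u = np.cos
x = boosted_sample(n)
Phi = basis(x)
sqw = np.sqrt(m / np.sum(Phi ** 2, axis=1))
coef, *_ = np.linalg.lstsq(sqw[:, None] * Phi, sqw * u(x), rcond=None)
```

The greedy step of the paper, which removes points from the retained sample while preserving the stability bound, is not reproduced in this sketch.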
