Abstract

Radial basis functions (RBFs) are prominent examples of reproducing kernels with associated reproducing kernel Hilbert spaces (RKHSs). The convergence theory for kernel-based interpolation in such a space is well understood, and optimal rates for the whole RKHS are often known. Schaback introduced the doubling trick [Math. Comp. 68 (1999), pp. 201–216], which shows that functions having double the smoothness required by the RKHS (along with specific, albeit complicated, boundary behavior) can be approximated with higher convergence rates than the optimal rates for the whole space. Other advances allowed interpolation of target functions that are less smooth, with the interpolation error measured in different norms. The current state of the art of error analysis for RBF interpolation treats target functions having smoothness up to twice that of the native space, but with error measured in norms weaker than that required for membership in the RKHS. Motivated by the fact that the kernels and the approximants they generate are smoother than required by the native space, this article extends the doubling trick to error measured in norms of higher smoothness. This extension holds for a family of kernels satisfying easily checked hypotheses which we describe in this article, and includes many prominent RBFs. In the course of the proof, new convergence rates are obtained for the abstract operator considered by DeVore and Ron in [Trans. Amer. Math. Soc. 362 (2010), pp. 6205–6229], and new Bernstein estimates are obtained relating high-order smoothness norms to the native space norm.
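For orientation, the display below sketches the doubling phenomenon in schematic form. This is an illustrative sketch under the common assumption that the native space is norm-equivalent to a Sobolev space $H^\tau(\Omega)$, $\Omega \subset \mathbb{R}^d$, with $s_{f,X}$ the kernel interpolant of $f$ on a node set $X$ having fill distance $h$; the exponents are the typical ones from the literature, not the precise statements proved in this paper.

\[
\|f - s_{f,X}\|_{L^\infty(\Omega)} \lesssim h^{\tau - d/2}\,\|f\|_{H^\tau(\Omega)}
\quad \text{for } f \in H^\tau(\Omega),
\qquad
\|f - s_{f,X}\|_{L^\infty(\Omega)} \lesssim h^{2\tau - d}\,\|f\|_{*}
\quad \text{for } f \text{ of roughly twice the smoothness,}
\]

where $\|\cdot\|_{*}$ denotes the norm of the smaller, smoother subspace (with suitable boundary behavior) for which the doubling trick applies.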
