Abstract

A reproducing kernel Hilbert space (RKHS) approximation problem arising from learning theory is investigated. K-functionals and moduli of smoothness with respect to RKHSs are defined in terms of Fourier–Bessel series and Fourier–Bessel transforms, respectively. Their equivalence is established and is used to derive an upper bound for the best RKHS approximation. The convergence rate is bounded in terms of the defined modulus of smoothness, which shows that RKHS approximation attains the same approximation ability as the Fourier–Bessel series and the Fourier–Bessel transform. In particular, it is shown that for an RKHS generated by the Bessel operator, the convergence rate coincides with the bound for a corresponding convolution operator approximation. These investigations exhibit new applications of Bessel functions, and the results obtained can be used to bound the approximation error in learning theory.
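For orientation, the K-functional and the equivalence referred to above are presumably of the standard Peetre type; the sketch below assumes that form, with placeholder spaces $X$ and $Y$ and constants $c_1, c_2$ (the paper's actual spaces are built from the RKHS and the Bessel operator and are not reproduced here):

\[
  K(f, t; X, Y) \;=\; \inf_{g \in Y} \bigl\{ \|f - g\|_{X} + t\, \|g\|_{Y} \bigr\}, \qquad t > 0,
\]
\[
  c_1\, \omega(f, t) \;\le\; K(f, t; X, Y) \;\le\; c_2\, \omega(f, t),
\]

where $\omega(f, t)$ denotes a modulus of smoothness defined via the Fourier–Bessel series or the Fourier–Bessel transform. An equivalence of this kind allows the best RKHS approximation error $\inf_{g \in \mathcal{H}_K} \|f - g\|_{X}$ to be bounded above in terms of $\omega(f, t)$, which is the mechanism behind the convergence-rate statements in the abstract.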
