Abstract

Support vector machine regression (SVMR) is an important part of statistical learning theory. The main difference between SVMR and classical least squares regression (LSR) is that SVMR measures the empirical error with the ϵ-insensitive loss rather than the quadratic loss. In this paper, we study the SVMR method for functional data analysis under the framework of reproducing kernel Hilbert spaces. The main tool in our theoretical analysis is concentration inequalities for suprema of appropriate empirical processes. As a result, we establish explicit convergence rates for the prediction risk of SVMR, which coincide with the minimax lower bound recently obtained in the literature for LSR.
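
For reference, the two losses contrasted in the abstract are standard; a minimal statement (using the generic notation $y$ for the response and $f(x)$ for the prediction, which is not fixed by the abstract itself):

```latex
% ϵ-insensitive loss used by SVMR: deviations within ϵ of the target incur no penalty
\ell_\epsilon\bigl(y, f(x)\bigr) = \max\bigl(0,\; |y - f(x)| - \epsilon\bigr)

% Quadratic loss used by classical LSR: every deviation is penalized, growing quadratically
\ell_2\bigl(y, f(x)\bigr) = \bigl(y - f(x)\bigr)^2
```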
