Abstract

We develop an intuitive geometric framework for support vector regression (SVR). By examining when ε-tubes exist, we show that SVR can be regarded as a classification problem in the dual space. Hard and soft ε-tubes are constructed by separating the convex or reduced convex hulls, respectively, of the training data with the response variable shifted up and down by ε. A novel SVR model is proposed based on choosing the max-margin plane between the two shifted data sets. Maximizing the margin corresponds to shrinking the effective ε-tube. In the proposed approach, the effects of the choices of all parameters become clear geometrically. The kernelized model corresponds to separating the convex or reduced convex hulls in feature space. Generalization bounds for classification can be extended to characterize the generalization performance of the proposed approach. We propose a simple iterative nearest-point algorithm that can be directly applied to the reduced convex hull case in order to construct soft ε-tubes. Computational comparisons with other SVR formulations are also included.
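
To make the geometric construction concrete, the sketch below illustrates the hard ε-tube case under simplifying assumptions. The responses are shifted up and down by ε to form two point sets in R^{d+1}; a Gilbert-style nearest-point iteration (used here purely for illustration; the paper's own iterative algorithm and its reduced-convex-hull extension for soft tubes may differ) finds the closest points of the two convex hulls; the perpendicular bisector of the segment joining them is taken as the max-margin plane and solved for y to recover the regression function. All function names are hypothetical.

```python
# Hypothetical sketch (not the paper's exact algorithm): build the shifted sets
# D+ = {(x_i, y_i + eps)} and D- = {(x_i, y_i - eps)} in R^{d+1}, find nearest
# points of their convex hulls, and read off the regression plane.
import numpy as np


def nearest_points_between_hulls(P, Q, max_iter=1000, tol=1e-8):
    """Gilbert-style iteration for the closest points of conv(P) and conv(Q)."""
    u, v = P[0].copy(), Q[0].copy()        # feasible starting points in each hull
    for _ in range(max_iter):
        z = u - v                          # current difference vector
        # Support points of the Minkowski-difference hull in direction -z.
        p = P[np.argmin(P @ z)]            # minimizes <z, p> over D+
        q = Q[np.argmax(Q @ z)]            # maximizes <z, q> over D-
        s = p - q
        # Stop when z is (nearly) the minimum-norm point of the difference hull.
        if z @ (z - s) <= tol:
            break
        # Exact line search: closest point to the origin on the segment [z, s].
        t = np.clip(z @ (z - s) / ((z - s) @ (z - s)), 0.0, 1.0)
        u = (1 - t) * u + t * p
        v = (1 - t) * v + t * q
    return u, v


def geometric_svr_fit(X, y, eps):
    """Hard eps-tube via hull separation (assumes the shifted hulls are separable)."""
    D_plus = np.column_stack([X, y + eps])    # responses shifted up by eps
    D_minus = np.column_stack([X, y - eps])   # responses shifted down by eps
    u, v = nearest_points_between_hulls(D_plus, D_minus)
    w = u - v                                 # normal of the max-margin plane
    c = w @ (u + v) / 2.0                     # plane passes through the midpoint
    # Plane: w[:-1].x + w[-1]*y = c; solve for y (w[-1] > 0 when a tube exists).
    def predict(Xq):
        return (c - Xq @ w[:-1]) / w[-1]
    return predict


# Tiny usage example on noiseless linear data.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 1))
y = 2.0 * X[:, 0] + 0.5
f = geometric_svr_fit(X, y, eps=0.1)
print(np.max(np.abs(f(X) - y)))  # expected to be small, well inside the 0.1 tube
```

The soft-tube case would cap the convex-combination coefficients (a reduced convex hull), and the kernelized case would replace inner products with kernel evaluations; neither extension is shown in this sketch.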
