Abstract

We have recently proposed a new approach to controlling the number of basis functions and the accuracy in support vector machines. The estimation problem is cast in a linear programming setting, which inherently enforces sparseness of the solution. The algorithm computes a nonlinear estimate in terms of kernel functions together with an ε > 0 such that at most a fraction ν of the training points have an error exceeding ε; the algorithm is robust to local perturbations of these points' target values. We give an explicit formulation of the optimization equations needed to solve the linear program and point out which modifications of the standard optimization setting are necessary to exploit the particular structure of the equations in the regression case.
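As an illustration only (not the authors' implementation), the linear program described above can be sketched as follows. The sketch assumes the standard ν-LP regression formulation — minimize (1/ℓ)Σ(αᵢ + αᵢ*) + C[(1/ℓ)Σ(ξᵢ + ξᵢ*) + νε] subject to the ε-tube constraints — an RBF kernel, and SciPy's HiGHS LP solver; the function name `nu_lp_regression` and all parameter defaults are illustrative choices.

```python
import numpy as np
from scipy.optimize import linprog

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix k(x, z) = exp(-gamma * ||x - z||^2)."""
    d = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def nu_lp_regression(X, y, nu=0.2, C=10.0, gamma=1.0):
    """Sketch of nu-LP regression: f(x) = sum_j (a_j - a*_j) k(x_j, x) + b,
    with epsilon optimized jointly so that at most a fraction nu of the
    training points lie outside the epsilon-tube."""
    l = len(y)
    K = rbf_kernel(X, X, gamma)

    # LP variable ordering: [alpha (l), alpha* (l), b (1), xi (l), xi* (l), eps (1)]
    n = 4 * l + 2
    c = np.zeros(n)
    c[:2 * l] = 1.0 / l            # (1/l) * sum(alpha + alpha*): enforces sparseness
    c[2 * l + 1:4 * l + 1] = C / l  # slack penalties (1/l) * sum(xi + xi*)
    c[-1] = C * nu                  # nu * eps: trades tube width against errors

    # f(x_i) - y_i <= eps + xi_i   <=>   K a - K a* + b - xi - eps <= y
    A1 = np.hstack([K, -K, np.ones((l, 1)),
                    -np.eye(l), np.zeros((l, l)), -np.ones((l, 1))])
    # y_i - f(x_i) <= eps + xi*_i  <=>  -K a + K a* - b - xi* - eps <= -y
    A2 = np.hstack([-K, K, -np.ones((l, 1)),
                    np.zeros((l, l)), -np.eye(l), -np.ones((l, 1))])
    A_ub = np.vstack([A1, A2])
    b_ub = np.concatenate([y, -y])

    bounds = [(0, None)] * n
    bounds[2 * l] = (None, None)  # the bias b is unconstrained

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    coef = res.x[:l] - res.x[l:2 * l]   # expansion coefficients alpha - alpha*
    bias = res.x[2 * l]
    eps = res.x[-1]
    return coef, bias, eps
```

At the solution, ε is not a fixed hyperparameter but part of the LP optimum, and the fraction of training points whose error strictly exceeds ε is at most ν.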
