Abstract

In this paper, a new linear programming formulation of 1-norm support vector regression (SVR) is proposed whose solution is obtained by solving an exterior penalty problem in the dual space as an unconstrained minimization problem using the Newton method. Solving the modified unconstrained minimization problem reduces to solving just a system of linear equations, as opposed to the quadratic programming problem of standard SVR, which leads to an extremely simple and fast algorithm. The algorithm converges from any starting point and can be easily implemented in MATLAB without any optimization packages. The main advantage of the proposed approach is that it leads to a robust and sparse model representation: many components of the optimal solution vector become zero, so the decision function can be determined using far fewer support vectors than SVR, smooth SVR (SSVR) and weighted SVR (WSVR) require. To demonstrate its effectiveness, experiments were performed on well-known synthetic and real-world benchmark datasets. Similar or better generalization performance than SVR, SSVR and WSVR, obtained in less training time, clearly exhibits the proposed method's suitability and applicability.
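The abstract's central computational claim is that each Newton iteration on the exterior penalty problem reduces to solving a system of linear equations. The following is a minimal sketch of such a generalized Newton iteration for a piecewise-quadratic exterior penalty of a generic linear program; the paper's actual SVR formulation is not given here, so the objective, the LP data `A`, `b`, `c`, and the parameters `eps` and `delta` are illustrative assumptions rather than the authors' method.

```python
import numpy as np

def plus(v):
    """Elementwise positive part (v)_+ = max(v, 0)."""
    return np.maximum(v, 0.0)

# Illustrative exterior penalty for an LP of the form
#   max b'u  s.t.  A'u <= c,  u >= 0,
# penalised as
#   f(u) = -eps*b'u + 0.5*||(A'u - c)_+||^2 + 0.5*||(-u)_+||^2.
# This is a generic stand-in, NOT the paper's SVR formulation.
def f(u, A, b, c, eps):
    return (-eps * (b @ u)
            + 0.5 * plus(A.T @ u - c) @ plus(A.T @ u - c)
            + 0.5 * plus(-u) @ plus(-u))

def grad(u, A, b, c, eps):
    return -eps * b + A @ plus(A.T @ u - c) - plus(-u)

def newton_penalty(A, b, c, eps=0.1, delta=1e-8, tol=1e-6, max_iter=50):
    """Generalized Newton iteration: each step solves one linear system
    built from a generalized Hessian of the piecewise-quadratic penalty."""
    u = np.ones(A.shape[0])
    for _ in range(max_iter):
        g = grad(u, A, b, c, eps)
        if np.linalg.norm(g) < tol:
            break
        d1 = (A.T @ u - c > 0).astype(float)   # active constraint penalties
        d2 = (-u > 0).astype(float)            # active nonnegativity penalties
        # delta*I regularizes the generalized Hessian so the system is solvable
        H = A @ np.diag(d1) @ A.T + np.diag(d2) + delta * np.eye(len(u))
        step = np.linalg.solve(H, g)           # just a linear system per step
        # Armijo backtracking keeps the iteration a descent method
        t, fu = 1.0, f(u, A, b, c, eps)
        while f(u - t * step, A, b, c, eps) > fu - 1e-4 * t * (g @ step):
            t *= 0.5
        u = u - t * step
    return u
```

Because the penalty is piecewise quadratic, the generalized Hessian is constant on each region, so once the iterate enters the region containing the minimizer a single linear solve essentially finishes the job; this is what makes such methods fast relative to quadratic programming solvers.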


