Abstract
Twin support vector regression (TSVR), Lagrangian TSVR (LTSVR) and $\epsilon$-TSVR achieve good generalization and faster computational speed than support vector regression (SVR) by solving a pair of smaller-sized quadratic programming problems (QPPs) instead of the single large QPP of SVR. In this paper, a simple and linearly convergent Lagrangian support vector machine algorithm for the dual of $\epsilon$-TSVR is proposed. The contributions of our formulation are as follows: (1) we consider the square of the 2-norm of the vector of slack variables instead of the usual 1-norm, which makes the objective functions strongly convex; (2) the regression problem is solved with just two systems of linear equations, as opposed to two QPPs in $\epsilon$-TSVR and TSVR or one large QPP in SVR, which leads to an extremely simple and fast algorithm; (3) a significant advantage of the proposed method is that it implements the structural risk minimization principle, whereas the primal problems of TSVR and LTSVR consider only empirical risk, owing to their complex structure, and may therefore overfit and yield suboptimal solutions in some cases; (4) experimental results on several artificial and benchmark datasets show the effectiveness of the proposed formulation.
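As a rough illustration of the kind of solver the abstract alludes to, the sketch below implements the classical Lagrangian SVM (LSVM) fixed-point iteration of Mangasarian and Musicant for a generic strongly convex dual QPP $\min_{u \ge 0} \tfrac{1}{2}u^\top Q u - e^\top u$, which is the template each of the two dual problems takes once the squared 2-norm slack term makes $Q$ positive definite. This is a hedged, simplified sketch, not the paper's exact $\epsilon$-TSVR formulation: the function name `lsvm_iterate`, the toy matrix $Q = GG^\top + I/C$, and the step size $\alpha = 1.9/C$ (the value used in the LSVM paper's experiments, valid since convergence requires $0 < \alpha < 2/C$ for this $Q$) are illustrative assumptions.

```python
import numpy as np

def lsvm_iterate(Q, e, alpha, tol=1e-10, max_iter=100_000):
    """Lagrangian SVM iteration for min_{u >= 0} 0.5 u'Qu - e'u.

    Uses the identity that the KKT conditions 0 <= u _|_ Qu - e >= 0
    are equivalent to Qu - e = ((Qu - e) - alpha*u)_+ for any alpha > 0,
    giving the fixed-point update u <- Q^{-1}(e + ((Qu - e) - alpha*u)_+).
    Only one matrix inversion is needed; each step is a matrix-vector product.
    """
    Qinv = np.linalg.inv(Q)          # factor once, reuse every iteration
    u = Qinv @ e                     # unconstrained minimizer as a start
    for _ in range(max_iter):
        h = Q @ u - e                # dual gradient
        u_new = Qinv @ (e + np.maximum(h - alpha * u, 0.0))
        if np.linalg.norm(u_new - u) < tol:
            return u_new
        u = u_new
    return u

# Toy strongly convex dual: Q = GG' + I/C, as arises when the squared
# 2-norm slack penalty adds the regularizing I/C term (illustrative only).
rng = np.random.default_rng(0)
G = rng.standard_normal((20, 5))
C = 10.0
Q = G @ G.T + np.eye(20) / C
e = np.ones(20)
u = lsvm_iterate(Q, e, alpha=1.9 / C)
```

At the returned point the KKT conditions of the nonnegativity-constrained QPP hold to numerical tolerance: `u >= 0`, `Q @ u - e >= 0`, and the two are complementary; the whole solve reduces to one linear system plus cheap iterations, which is the source of the speed the abstract claims.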
Published in: International Journal of Machine Learning and Cybernetics