Abstract

Twin support vector regression (TSVR) is an effective regression machine that solves a pair of smaller-sized quadratic programming problems (QPPs) rather than the single large QPP of classical support vector regression (SVR), which makes TSVR approximately four times faster to train than conventional SVR. However, TSVR implements the empirical risk minimization principle, which limits its generalization ability to a certain extent. To improve the prediction accuracy and stability of the algorithm, we propose a novel TSVR that introduces a regularization term into the objective function, ensuring that the new algorithm implements the structural risk minimization principle instead of the empirical risk minimization principle. Moreover, the up- and down-bound functions obtained by our algorithm are as parallel as possible, which in theory yields lower prediction error and lower standard deviation. Experimental results on one artificial dataset and six benchmark datasets indicate the feasibility and validity of our novel TSVR.
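The abstract describes fitting down- and up-bound functions by solving two smaller QPPs, with a regularization term added to each objective to bring in structural risk minimization. As a hedged illustration only (the paper's exact formulation, kernel, and solver are not reproduced here), a minimal linear sketch might solve each primal QPP directly with SciPy's SLSQP; the target shifts, constraint signs, and parameter values below are assumptions of this sketch, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def fit_bound(X, y, eps, C, c3, sign):
    """Fit one bound function f(x) = w.x + b of a linear TSVR.

    sign = -1 solves the down-bound QPP (targets shifted to y - eps);
    sign = +1 solves the up-bound QPP  (targets shifted to y + eps).
    c3 is an assumed regularization weight on (||w||^2 + b^2), standing in
    for the regularization term the abstract describes.
    """
    n, d = X.shape

    def objective(z):
        w, b, xi = z[:d], z[d], z[d + 1:]
        # Squared loss on the shifted targets, plus regularization and slack penalty.
        r = (y + sign * eps) - (X @ w + b)
        return 0.5 * r @ r + 0.5 * c3 * (w @ w + b * b) + C * xi.sum()

    # Down-bound constraint: y - f >= eps - xi ;  up-bound: f - y >= eps - xi.
    cons = {"type": "ineq",
            "fun": lambda z: sign * ((X @ z[:d] + z[d]) - y) - eps + z[d + 1:]}
    bounds = [(None, None)] * (d + 1) + [(0.0, None)] * n   # slacks xi >= 0
    z0 = np.zeros(d + 1 + n)
    res = minimize(objective, z0, method="SLSQP", bounds=bounds,
                   constraints=[cons], options={"maxiter": 500, "ftol": 1e-10})
    return res.x[:d], res.x[d]

# Toy data: a noiseless line, so the two bounds should bracket y tightly.
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = 2.0 * X.ravel() + 1.0

w1, b1 = fit_bound(X, y, eps=0.1, C=10.0, c3=1e-3, sign=-1)  # down-bound
w2, b2 = fit_bound(X, y, eps=0.1, C=10.0, c3=1e-3, sign=+1)  # up-bound

# Final regressor: the mean of the down- and up-bound functions.
pred = 0.5 * ((X @ w1 + b1) + (X @ w2 + b2))
```

Each QPP here involves only one bound's variables, which is what makes the two problems smaller than the single QPP of standard SVR; the paper presumably solves the duals instead, which this sketch does not attempt.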
