Abstract

In this paper, to improve the performance of the capped L1-norm twin support vector machine (CTSVM), we first propose a new robust twin bounded support vector machine (RTBSVM) by introducing a regularization term. The significant advantage of our RTBSVM over CTSVM is that the structural risk minimization principle is implemented. This embodies the essence of statistical learning theory, so this modification can improve classification performance. Furthermore, to accelerate the computation of RTBSVM while inheriting its robustness, we construct a least squares version of RTBSVM (called FRTBSVM). This formulation leads to a simple and fast algorithm for binary classification that solves just two systems of linear equations. Finally, we derive two simple and effective iterative optimization algorithms for solving RTBSVM and FRTBSVM, respectively, and rigorously analyze and prove the computational complexity, local optimality, and convergence of these algorithms. Experimental results on one synthetic dataset and nine UCI datasets demonstrate that our methods are competitive with other methods. Additionally, FRTBSVM is directly applied to recognizing the purity of hybrid maize seeds from near-infrared spectral data. Experiments show that our method achieves better performance than traditional methods in most spectral regions.
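To illustrate why a least squares twin SVM is fast, the sketch below implements a generic linear least squares twin SVM (not the paper's FRTBSVM, whose robust capped L1-norm loss and iterative solver are not reproduced here). Each class gets its own hyperplane, and each hyperplane is obtained from a single small linear system; the data, the trade-off parameters `c1`/`c2`, and the small ridge term `eps` (analogous in spirit to the regularization term discussed above) are all illustrative assumptions.

```python
import numpy as np

# Generic least squares twin SVM sketch (illustrative, not the paper's FRTBSVM).
# Two synthetic, well-separated 2-D clusters stand in for the two classes.
rng = np.random.default_rng(0)
A = rng.normal(loc=[-2.0, -2.0], scale=0.3, size=(40, 2))  # class +1 samples
B = rng.normal(loc=[+2.0, +2.0], scale=0.3, size=(40, 2))  # class -1 samples

c1 = c2 = 1.0  # trade-off parameters (hypothetical values)
eps = 1e-6     # small ridge term to keep the systems well conditioned

# Augment with a bias column: E = [A, e], F = [B, e].
E = np.hstack([A, np.ones((A.shape[0], 1))])
F = np.hstack([B, np.ones((B.shape[0], 1))])

# Hyperplane for class +1: z1 = -(F^T F + (1/c1) E^T E)^{-1} F^T e
z1 = -np.linalg.solve(F.T @ F + (1.0 / c1) * E.T @ E + eps * np.eye(3),
                      F.T @ np.ones(B.shape[0]))
# Hyperplane for class -1: z2 = (E^T E + (1/c2) F^T F)^{-1} E^T e
z2 = np.linalg.solve(E.T @ E + (1.0 / c2) * F.T @ F + eps * np.eye(3),
                     E.T @ np.ones(A.shape[0]))

def predict(X):
    """Assign each sample to the class whose hyperplane is nearer."""
    Xe = np.hstack([X, np.ones((X.shape[0], 1))])
    d1 = np.abs(Xe @ z1) / np.linalg.norm(z1[:2])
    d2 = np.abs(Xe @ z2) / np.linalg.norm(z2[:2])
    return np.where(d1 <= d2, 1, -1)

# The two cluster centers should land in opposite classes.
print(predict(np.array([[-2.0, -2.0], [2.0, 2.0]])))
```

Because no quadratic program is involved, training cost is dominated by solving two (n+1)-dimensional linear systems, which is what makes the least squares variant attractive for larger datasets.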
