Abstract

A theoretical advantage of support vector machines (SVMs) is the combination of empirical and structural risk minimization, which balances the complexity of the model against its success at fitting the training data. Metaheuristics have mostly been combined with SVMs either to tune hyperparameters or to perform feature selection. In this paper, we present a new approach, named SATE, to obtain sparse SVMs based on simulated annealing (SA). In our proposal, SA is used to solve the quadratic optimization problem that arises from the SVM formulation rather than to tune the hyperparameters. We compared our proposal with sequential minimal optimization (SMO), the kernel adatron (KA), a standard quadratic programming (QP) solver, and recent particle swarm optimization (PSO)- and genetic algorithm (GA)-based versions. Generally speaking, SATE is equivalent to SMO in terms of accuracy and mean number of support vectors, is sparser than KA, QP, LPSO, and GA, and achieves higher accuracies than the GA- and PSO-based versions. Moreover, SATE successfully embeds the SVM constraints and provides a competitive classifier while maintaining its simplicity and high sparseness in the solution.
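To make the core idea concrete, the sketch below illustrates one way simulated annealing could be applied directly to the SVM dual variables while respecting the box constraints 0 <= alpha_i <= C and the equality constraint sum(alpha_i * y_i) = 0. This is an illustrative reading of the abstract, not the published SATE algorithm: the RBF kernel, the SMO-style feasible pair proposals, the geometric cooling schedule, and all hyperparameter values (C, gamma, T0, cooling, n_iter) are assumptions introduced for illustration only.

```python
# Minimal sketch: simulated annealing over the SVM dual variables alpha.
# NOT the authors' SATE implementation; kernel, proposal scheme, and
# hyperparameters are hypothetical choices for illustration.
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

def dual_objective(alpha, y, K):
    # SVM dual: sum(alpha) - 0.5 * sum_ij alpha_i alpha_j y_i y_j K_ij
    return alpha.sum() - 0.5 * (alpha * y) @ K @ (alpha * y)

def feasible_step_bounds(a_i, a_j, y_i, y_j, C):
    # Bounds on t so that a_i + t*y_i and a_j - t*y_j both stay in [0, C];
    # such pair moves leave sum(alpha * y) unchanged.
    lo_i, hi_i = (-a_i, C - a_i) if y_i > 0 else (a_i - C, a_i)
    lo_j, hi_j = (a_j - C, a_j) if y_j > 0 else (-a_j, C - a_j)
    return max(lo_i, lo_j), min(hi_i, hi_j)

def sa_train_svm_dual(X, y, C=1.0, T0=1.0, cooling=0.995, n_iter=20000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    K = rbf_kernel(X)
    alpha = np.zeros(n)                    # feasible start: sum(alpha*y) = 0
    obj = dual_objective(alpha, y, K)
    best, best_obj, T = alpha.copy(), obj, T0
    for _ in range(n_iter):
        i, j = rng.choice(n, size=2, replace=False)
        lo, hi = feasible_step_bounds(alpha[i], alpha[j], y[i], y[j], C)
        if lo < hi:
            t = rng.uniform(lo, hi)        # step kept inside the feasible region
            cand = alpha.copy()
            cand[i] += t * y[i]
            cand[j] -= t * y[j]
            cand_obj = dual_objective(cand, y, K)
            # Metropolis rule: accept improvements, occasionally accept worse states
            if cand_obj >= obj or rng.random() < np.exp((cand_obj - obj) / T):
                alpha, obj = cand, cand_obj
                if obj > best_obj:
                    best, best_obj = alpha.copy(), obj
        T *= cooling                       # geometric cooling schedule
    return best
```

In this reading, the pairwise proposal is one simple way to keep the equality constraint satisfied at every step while staying inside the box, which mirrors the abstract's emphasis on embedding the SVM constraints in the search itself; sparseness would then come from the many alpha values that remain at zero.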

