Abstract
A theoretical advantage of large-margin classifiers such as support vector machines (SVMs) is that training performs empirical and structural risk minimization, balancing the complexity of the model against its success at fitting the training data. Metaheuristics have been combined with SVMs to select features, tune hyperparameters, or obtain a reduced set of support vectors. Although these tasks are interesting, metaheuristics such as simulated annealing (SA) have not played an important role in solving the quadratic optimization problem that arises from support vector machines; instead, well-known methods such as sequential minimal optimization, the kernel adatron, and classical mathematical-programming techniques have been used for this purpose. In this paper, we propose using simulated annealing to solve this quadratic optimization problem. Our proposal is attractive compared with the aforementioned methods: it is simple, achieves similar (or even higher) accuracy, and yields highly sparse solutions.
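The abstract does not detail the authors' SA procedure, so the following is only a minimal sketch of the general idea it describes: a simulated annealing loop applied to the soft-margin SVM dual QP, maximizing Σᵢ αᵢ − ½ Σᵢⱼ αᵢαⱼ yᵢyⱼ K(xᵢ, xⱼ) subject to the box constraint 0 ≤ αᵢ ≤ C, with the equality constraint Σᵢ αᵢyᵢ = 0 handled here by a quadratic penalty. The function names, the single-coordinate perturbation move, and the geometric cooling schedule are all illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def svm_dual_objective(alpha, K, y, penalty=1e3):
    # Soft-margin SVM dual: sum(alpha) - 0.5 * alpha^T (yy^T * K) alpha,
    # with the equality constraint sum(alpha_i * y_i) = 0 enforced as a penalty.
    Q = (y[:, None] * y[None, :]) * K
    return alpha.sum() - 0.5 * alpha @ Q @ alpha - penalty * (alpha @ y) ** 2

def simulated_annealing_svm(K, y, C=1.0, T0=1.0, cooling=0.995, steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    alpha = np.zeros(n)                       # sparse starting point: no support vectors yet
    obj = svm_dual_objective(alpha, K, y)
    best, best_obj = alpha.copy(), obj
    T = T0
    for _ in range(steps):
        cand = alpha.copy()
        i = rng.integers(n)                   # perturb one Lagrange multiplier at a time
        cand[i] = np.clip(cand[i] + rng.normal(0.0, 0.1 * C), 0.0, C)  # keep the box constraint
        cand_obj = svm_dual_objective(cand, K, y)
        # Metropolis rule: always accept improvements; accept worse moves
        # with probability exp(delta / T), which shrinks as T cools.
        if cand_obj > obj or rng.random() < np.exp((cand_obj - obj) / T):
            alpha, obj = cand, cand_obj
            if obj > best_obj:
                best, best_obj = alpha.copy(), obj
        T *= cooling                          # geometric cooling schedule
    return best
```

Starting from α = 0 and moving one multiplier per step tends to leave many αᵢ exactly zero, which is consistent with the sparse solutions the abstract reports.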