Abstract
This paper introduces a new approach to building sparse least squares support vector machine (LSSVM) classifiers for classification tasks, based on multi-objective genetic algorithms (GAs). LSSVM classifiers are an alternative to SVM classifiers because their training only requires solving a linear equation system instead of a quadratic programming optimization problem. However, the loss of sparseness in the Lagrange multiplier vector (i.e. the solution) is a significant drawback of these classifiers. To overcome this lack of sparseness, we propose a multi-objective GA approach that leaves some support vectors out of the solution without degrading the classifier's accuracy, and in some cases even improving it. The main idea is to discard outliers, non-relevant patterns, and patterns that may be corrupted by noise, which would otherwise prevent the classifier from achieving higher accuracy with a reduced set of support vectors. We show that the resulting sparse LSSVM classifiers achieve performance equivalent (in some cases, superior) to standard full-set LSSVM classifiers on real data sets. Differently from previous works, genetic algorithms are used here to obtain sparseness, not to find the optimal values of the LSSVM hyper-parameters.
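For context, training an LSSVM classifier reduces to a linear system in the bias $b$ and the Lagrange multipliers $\boldsymbol{\alpha}$. The system below follows the standard LSSVM formulation and is given only as an illustrative sketch; the notation is not taken from this paper:

\[
\begin{bmatrix}
0 & \mathbf{y}^{\top} \\
\mathbf{y} & \Omega + \gamma^{-1} I
\end{bmatrix}
\begin{bmatrix}
b \\ \boldsymbol{\alpha}
\end{bmatrix}
=
\begin{bmatrix}
0 \\ \mathbf{1}
\end{bmatrix},
\qquad
\Omega_{ij} = y_i\, y_j\, K(\mathbf{x}_i, \mathbf{x}_j),
\]

where $K$ is the kernel function and $\gamma$ the regularization parameter. Because the equality constraints involve every training pattern, virtually all multipliers $\alpha_i$ are non-zero; this is the loss of sparseness that the proposed multi-objective GA selection addresses.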