Abstract

This paper presents a new approach for pruning dataset features (i.e., feature selection) based on genetic algorithms (GAs) and sparse least squares support vector machines (LSSVM) for classification tasks. LSSVM is a modified version of the standard Support Vector Machine (SVM) that is generally faster to train, since training an SVM requires solving a quadratic programming problem, whereas training an LSSVM requires only solving a system of linear equations. GAs solve optimization problems without assuming linearity, differentiability, continuity, or convexity of the objective function. Some previous works combine GAs and LSSVM, but mostly to tune the LSSVM kernel and/or classifier parameters. Our proposal, in contrast, combines LSSVM and GAs to obtain sparse models in which each support vector retains only a few features, in a feature-selection sense. The idea behind our proposal is to use GAs to remove non-relevant features from the patterns. Removing a pattern has less impact than removing a feature, since a training dataset generally contains more patterns than features. According to the results, our proposal discards non-relevant features while maintaining, or even improving, classification accuracy.
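The abstract gives no implementation details, so the following is only a minimal illustrative sketch of the general wrapper idea it describes, not the authors' actual method: a GA evolves binary feature masks, and each mask is scored by training an LSSVM (Suykens-style linear-system formulation with an RBF kernel) on the masked data. The fitness function (training accuracy minus a small per-feature penalty), all parameter values (`gamma`, `sigma`, population size), and the toy dataset are assumptions for the sake of a runnable example; a real study would use a proper validation scheme.

```python
import math
import random

def rbf(a, b, sigma=1.0):
    """RBF kernel between two feature vectors."""
    d2 = sum((x - z) ** 2 for x, z in zip(a, b))
    return math.exp(-d2 / (2.0 * sigma ** 2))

def solve(A, rhs):
    """Solve A x = rhs by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def lssvm_train(X, y, gamma=10.0):
    """LSSVM training: one linear system instead of SVM's QP:
    [0   y^T          ] [b    ]   [0]
    [y   K∘yy^T + I/γ ] [alpha] = [1]
    """
    n = len(X)
    A = [[0.0] * (n + 1) for _ in range(n + 1)]
    rhs = [0.0] + [1.0] * n
    for i in range(n):
        A[0][i + 1] = A[i + 1][0] = y[i]
        for j in range(n):
            A[i + 1][j + 1] = y[i] * y[j] * rbf(X[i], X[j])
        A[i + 1][i + 1] += 1.0 / gamma
    sol = solve(A, rhs)
    return sol[0], sol[1:]                        # bias b, multipliers alpha

def lssvm_predict(X, y, b, alpha, x):
    s = b + sum(a * yi * rbf(xi, x) for a, yi, xi in zip(alpha, y, X))
    return 1 if s >= 0 else -1

def mask_rows(X, mask):
    """Keep only the features whose mask bit is 1 (the GA chromosome)."""
    return [[v for v, m in zip(row, mask) if m] for row in X]

def fitness(mask, X, y):
    """Training accuracy minus a small per-feature penalty (an assumption)."""
    if not any(mask):
        return 0.0
    Xm = mask_rows(X, mask)
    b, alpha = lssvm_train(Xm, y)
    acc = sum(lssvm_predict(Xm, y, b, alpha, xi) == yi
              for xi, yi in zip(Xm, y)) / len(y)
    return acc - 0.01 * sum(mask)

def ga_select(X, y, pop=12, gens=15, pm=0.1, seed=0):
    """Evolve binary feature masks; return the best mask found."""
    rng = random.Random(seed)
    d = len(X[0])
    popn = [[rng.randint(0, 1) for _ in range(d)] for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(popn, key=lambda m: fitness(m, X, y), reverse=True)
        nxt = [m[:] for m in scored[:2]]          # elitism
        while len(nxt) < pop:
            p1, p2 = rng.sample(scored[:6], 2)    # truncation selection
            cut = rng.randrange(1, d)             # one-point crossover
            child = [g ^ (rng.random() < pm)      # bit-flip mutation
                     for g in p1[:cut] + p2[cut:]]
            nxt.append(child)
        popn = nxt
    return max(popn, key=lambda m: fitness(m, X, y))

# Toy demo: only feature 0 carries the label; features 1-2 are noise.
_rng = random.Random(1)
X = [[_rng.uniform(-1, 1) for _ in range(3)] for _ in range(30)]
y = [1 if row[0] > 0 else -1 for row in X]
best_mask = ga_select(X, y)
```

Note the design choice the abstract motivates: the inner loop trains an LSSVM many times, which is affordable precisely because each training is a linear solve rather than a QP.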
