Abstract

This letter adopts a genetic algorithm (GA) to learn the feature scaling that is most favorable to a support vector machine (SVM) classifier; the combined method is named GA-SVM. The relevance of each feature to the classification task, measured by a real-valued scaling coefficient, is estimated efficiently by the GA, which uses a heavily biased operator to promote sparsity in the feature scaling. The method offers several benefits: feature selection is performed implicitly, since irrelevant features receive a scaling of zero, and an SVM classifier with improved generalization ability is learned at the same time. Experimental comparisons between the original SVM and GA-SVM demonstrate both economical feature selection and excellent classification accuracy on a junk e-mail recognition problem and an Internet advertisement recognition problem. The results show that, compared with the original SVM classifier, GA-SVM significantly reduces the number of support vectors and achieves better classification results. They also show that a GA provides a simple, general, and powerful framework for tuning parameters in an optimization problem, which directly improves the recognition performance and recognition rate of the SVM.
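
To make the idea concrete, the following is a minimal sketch of GA-based feature scaling for an SVM, not the authors' exact implementation: the population layout, truncation selection, Gaussian mutation, zero_bias rate, and use of scikit-learn's SVC and cross_val_score are all illustrative assumptions. A candidate solution is a real-valued scaling vector; fitness is the cross-validated accuracy of an SVM trained on the scaled features, and a sparsity-biased mutation pushes some scalings exactly to zero, which discards the corresponding features.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def fitness(scaling, X, y):
    # Cross-validated accuracy of an SVM trained on the scaled features.
    return cross_val_score(SVC(kernel="rbf"), X * scaling, y, cv=3).mean()

def ga_svm(X, y, pop_size=20, generations=30, zero_bias=0.3, seed=0):
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Initial population: random real-valued scalings in [0, 1].
    pop = rng.random((pop_size, n_features))
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(n_features) < 0.5            # uniform crossover
            child = np.where(mask, a, b)
            child += rng.normal(0.0, 0.1, n_features)      # Gaussian mutation
            # Sparsity-biased operator (assumed form): push some scalings to
            # exactly zero, which eliminates the corresponding features.
            child[rng.random(n_features) < zero_bias] = 0.0
            children.append(np.clip(child, 0.0, 1.0))
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[np.argmax(scores)]

if __name__ == "__main__":
    from sklearn.datasets import make_classification
    X, y = make_classification(n_samples=200, n_features=20,
                               n_informative=5, random_state=0)
    best = ga_svm(X, y)
    print("selected features:", np.flatnonzero(best))
    print("cross-validated accuracy:", fitness(best, X, y))

The synthetic make_classification data above stands in for the e-mail and Internet-advertisement datasets mentioned in the abstract; the zero entries of the returned scaling vector identify the features that would be dropped.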
