Abstract

We propose a novel training method that increases the efficiency and generalization ability of the support vector machine (SVM). The classification efficiency of an SVM is directly determined by the number of support vectors it uses, which is often large in complicated classification problems, where a highly convoluted separating hypersurface is needed for accurate nonlinear classification. However, the separating hypersurface can become unnecessarily over-convoluted around extreme outliers, since such outliers easily dominate the SVM objective function; this ultimately harms both the efficiency and the generalization of the SVM on unseen test samples. To avoid this problem, we propose a novel objective function for the SVM in which an adaptive penalty term suppresses the effect of extreme outliers, thereby simplifying the separating hypersurface and increasing classification efficiency. Because margin maximization is no longer dominated by these extreme outliers, the resulting SVM tends to have a wider margin, i.e., better generalization ability. Importantly, our objective function can be reformulated as a dual problem similar to that of the standard SVM, so any existing SVM training algorithm can be reused to train the proposed model. We have extensively evaluated our method on the UCI machine learning repository, as well as on a real clinical problem, i.e., tissue classification in prostate ultrasound images. Experimental results show that our method simultaneously increases the classification efficiency and the generalization ability of the SVM.
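To make the core idea concrete, here is a minimal sketch, not the paper's exact formulation: the "adaptive penalty term" is approximated by per-sample weights on the hinge loss of a simple linear SVM trained by subgradient descent, so that an extreme outlier cannot dominate the objective. The function name, the weighting rule (inverse squared distance to the class mean), and all hyperparameters are illustrative assumptions.

```python
# Illustrative sketch only: per-sample weights stand in for the paper's
# adaptive penalty term. All names and the weighting rule are assumptions.
import numpy as np

def train_weighted_linear_svm(X, y, sample_weight, C=1.0, lr=0.01, epochs=200):
    """Linear SVM via subgradient descent on a weighted hinge loss:
    0.5*||w||^2 + C * sum_i s_i * max(0, 1 - y_i*(w.x_i + b)).
    A small weight s_i suppresses sample i's influence on the objective."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1  # samples violating the margin
        w -= lr * (w - C * (sample_weight[viol] * y[viol]) @ X[viol])
        b -= lr * (-C * np.sum(sample_weight[viol] * y[viol]))
    return w, b

rng = np.random.default_rng(0)
X_neg = rng.normal([-2.0, 0.0], 0.4, size=(40, 2))  # class -1 cluster
X_pos = rng.normal([2.0, 0.0], 0.4, size=(40, 2))   # class +1 cluster
outlier = np.array([[3.0, 0.0]])                    # extreme -1 outlier
X = np.vstack([X_neg, outlier, X_pos])
y = np.array([-1] * 41 + [1] * 40)

# Down-weight samples far from their own class mean (a crude stand-in
# for the adaptive penalty): the outlier's weight becomes tiny, so the
# separating boundary is no longer pulled toward it.
s = np.ones(len(X))
for c in (-1, 1):
    idx = np.where(y == c)[0]
    dist = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
    s[idx] = 1.0 / (1.0 + dist ** 2)

w_r, b_r = train_weighted_linear_svm(X, y, s)
```

In the paper's actual method the training is done in the dual, as for the standard SVM; for a kernel SVM the same per-sample penalty scaling is available, for instance, through the `sample_weight` argument of scikit-learn's `SVC.fit`.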
