Abstract
Existing maximum-margin support vector machines (SVMs) construct the hyperplane that separates positive and negative feature vectors with the maximum margin. These SVMs are effective when datasets are large. However, when few training samples are available, the hyperplane is easily influenced by outliers that lie geometrically within the opposite class. We propose a modified SVM that weights feature vectors to reflect the local density of support vectors and quantifies classification uncertainty in terms of the local classification capability of each training sample. We derive a primal formulation of an SVM that incorporates these modifications and implement an RC-margin SVM in its simplest form. We evaluate our model on the recognition of handwritten numerals and obtain a higher recognition rate than a standard maximum-margin SVM, a weighted SVM, or an SVM that accounts for classification uncertainty.
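As a point of reference (this is the generic weighted soft-margin primal that modifications of this kind typically build on, not the paper's exact RC-margin objective), each training sample $(\mathbf{x}_i, y_i)$ receives a weight $s_i$, which in this setting would encode the local density of support vectors around sample $i$:

$$\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}} \;\; \tfrac{1}{2}\lVert\mathbf{w}\rVert^{2} + C\sum_{i=1}^{n} s_i\,\xi_i \quad \text{subject to} \quad y_i\bigl(\mathbf{w}^{\top}\mathbf{x}_i + b\bigr) \ge 1 - \xi_i,\;\; \xi_i \ge 0.$$

Setting every $s_i = 1$ recovers the standard soft-margin SVM, while down-weighting samples in low-density or uncertain regions reduces the influence of outliers lying near the opposite class.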