Abstract

Nonlinear support vector machines (SVMs) rely on the kernel trick and trade-off parameters to build nonlinear models for complex classification problems and to balance misclassification against generalization. The difficulty of choosing the kernel and its parameters has motivated the use of local nearest neighbor (NN) classifiers in lieu of global classifiers. This substitution, however, forfeits the SVM's advantage of global error minimization. The NN rule, for its part, assumes that class-conditional probabilities are locally constant, an assumption that fails near class boundaries and in high-dimensional spaces due to the curse of dimensionality. We propose a hybrid classification method combining a global SVM with local NN classifiers. Local classifiers are invoked only where the global SVM is likely to fail, and they adopt an adaptive metric driven by local SVM discriminative boundaries. Improved performance is demonstrated compared to partially similar methods.
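To make the hybrid scheme concrete, here is a minimal sketch of its core idea: trust the global SVM away from its decision boundary and fall back to a local NN vote near it. This is not the paper's exact algorithm (which additionally adapts the NN metric using local SVM boundaries); the margin threshold `tau`, the RBF kernel, `C`, and `k` are illustrative assumptions, and a binary problem is assumed.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

def hybrid_predict(X_train, y_train, X_test, tau=0.5, k=7):
    # Global SVM; kernel and C are placeholder settings, not values from the paper.
    svm = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
    scores = svm.decision_function(X_test)  # signed distance to the boundary (binary case)
    preds = svm.predict(X_test)

    # Points with |score| < tau lie in a band around the boundary, where the
    # global SVM is most likely to fail; re-classify them with a local NN rule.
    near = np.abs(scores) < tau
    if near.any():
        knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
        preds[near] = knn.predict(X_test[near])
    return preds
```

In this sketch the NN step uses a plain Euclidean metric; the method described in the abstract would instead warp that metric locally according to the SVM's discriminative direction.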
