Abstract

Support vector machines (SVMs) have been successfully applied to classification problems. The difficulty of selecting the most effective error penalty has been partly resolved with the ν-SVM. However, uneven training class sizes, which occur frequently in target detection problems, result in machines biased towards the class with the larger training set. We propose an extended ν-SVM to counter the effects of unbalanced training class sizes. The resulting dual ν-SVM provides the facility to counter these effects, as well as to adjust the error penalty of each class separately. The parameter ν of each class provides a lower bound on the fraction of support vectors of that class and an upper bound on the fraction of bounded support vectors of that class. These bounds allow control of the error rate permitted for each class, and enable the training of machines with specific error-rate requirements.
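For context, the sketch below gives the standard single-ν soft-margin primal of Schölkopf et al., which is the baseline formulation the proposed dual ν-SVM extends; the abstract does not state the extended objective, so only the baseline is shown and the per-class parameters are described in comments as an assumption about the extension.

```latex
% Standard single-\nu soft-margin SVM primal (Schölkopf et al., 2000),
% with \ell training points (x_i, y_i), y_i \in \{-1,+1\}.
% The dual \nu-SVM described in the abstract replaces the single \nu
% with one parameter per class (e.g. \nu_+ and \nu_-); the exact form
% of that extended objective is not given here and is assumed.
\begin{align}
  \min_{w,\,b,\,\xi,\,\rho} \quad
    & \tfrac{1}{2}\|w\|^{2} \;-\; \nu\rho \;+\; \frac{1}{\ell}\sum_{i=1}^{\ell}\xi_{i} \\
  \text{s.t.} \quad
    & y_{i}\bigl(w^{\top}x_{i} + b\bigr) \;\ge\; \rho - \xi_{i},
      \qquad \xi_{i} \ge 0, \qquad \rho \ge 0 .
\end{align}
```

In this baseline, ν is known to be an upper bound on the fraction of margin errors and a lower bound on the fraction of support vectors; the per-class bounds stated in the abstract are the class-wise analogue of this property.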
