Abstract

The minimization of the empirical risk based on an arbitrary Bregman divergence is known to provide posterior class probability estimates in classification problems, but the accuracy of the estimate for a given value of the true posterior depends on the specific choice of divergence. Ad hoc Bregman divergences can be designed to achieve higher estimation accuracy for the posterior probability values that are most critical in a particular cost-sensitive classification scenario. Moreover, some sequences of Bregman loss functions can be constructed so that their minimization asymptotically guarantees the minimum number of errors in nonseparable cases and maximum-margin classifiers in separable problems. In this paper, we analyze general conditions on the Bregman generator under which this property holds, and we generalize the result to cost-sensitive classification.
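
As background for the abstract's opening claim, the sketch below records the standard argument (well-known material, not the paper's own construction) for why minimizing an expected Bregman loss recovers the posterior: the minimizer is the conditional mean of the label, which for binary labels equals the posterior class probability.

% Background sketch (standard result, not specific to this paper): for a strictly
% convex, differentiable generator \phi, the Bregman divergence is
\[
  d_\phi(p, q) \;=\; \phi(p) - \phi(q) - \phi'(q)\,(p - q).
\]
% Taking a binary label y \in \{0, 1\} and differentiating the expected loss in q,
\[
  \frac{\partial}{\partial q}\,\mathbb{E}\!\left[ d_\phi(y, q) \mid x \right]
  \;=\; -\,\phi''(q)\left( \mathbb{E}[y \mid x] - q \right),
\]
% which vanishes only at q = \mathbb{E}[y \mid x] = P(y = 1 \mid x), since \phi'' > 0
% by strict convexity. Squared error (\phi(t) = t^2) and log loss
% (\phi(t) = t \log t + (1 - t)\log(1 - t)) are the two most familiar special cases.

The choice of generator \phi controls how this estimation accuracy is distributed across posterior values, which is what the paper exploits for cost-sensitive classification.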
