Abstract

In the framework of agnostic learning, one of the main open problems in the theory of multi-category pattern classification is characterizing how the confidence interval of a guaranteed risk should vary as a function of the fundamental parameters: the sample size m and the number C of categories. This is especially the case when working under minimal learnability hypotheses. We consider margin classifiers based on classes of vector-valued functions with one component function per category. The classes of component functions are uniform Glivenko-Cantelli classes, and the vector-valued functions take their values in a hypercube of R^C. For these classifiers, a well-known guaranteed risk based on a Rademacher complexity applies. Several studies have dealt with the derivation of an upper bound on this complexity. This article establishes a bound based on a new generalized Sauer-Shelah lemma. Under the additional assumption that the γ-dimensions of the classes of component functions grow no faster than polynomially with γ^{-1}, the bound's growth rate with C is O(√C ln(C)). This behaviour holds true irrespective of the degree of the polynomial.
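To give a feel for the stated O(√C ln(C)) rate, the sketch below (purely illustrative; the function `bound_growth` is a hypothetical stand-in for the dependence on C, not the paper's actual bound) compares that growth factor with linear growth in C:

```python
import math

def bound_growth(C: int) -> float:
    """Illustrative growth factor sqrt(C) * ln(C), matching the
    abstract's stated O(sqrt(C) ln(C)) dependence on the number
    of categories C. Not the paper's actual risk bound."""
    return math.sqrt(C) * math.log(C)

# The ratio to C shrinks as C grows: the rate is sublinear in C.
for C in (2, 10, 100, 1000):
    print(f"C={C:5d}  sqrt(C)ln(C)={bound_growth(C):8.2f}  ratio to C={bound_growth(C) / C:.3f}")
```

The point of the comparison is only that √C ln(C) grows strictly slower than C, so the dependence on the number of categories is sublinear.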
