Abstract

We consider several complexity measures that capture the difficulty of learning under the i.i.d. assumption. Among these measures are the growth function and VC dimension, the covering number and fat-shattering dimension, and the Rademacher complexity from statistical learning theory. Relationships among these complexity measures, their connection to learning, and tools for bounding them are provided. For each complexity measure, a uniform upper bound on the generalization error of classification problems is presented.
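
As an illustration of the kind of uniform bound surveyed, the following is a sketch (in LaTeX) of a representative generalization bound based on Rademacher complexity for binary classification; the exact constants and form depend on the formulation used in the text, so this should be read as a standard-form example rather than the paper's own statement.

% Sketch of a representative Rademacher-complexity generalization bound.
% Constants and exact form vary across formulations; this follows the
% standard statement for a class H of binary classifiers.
\begin{theorem}[Sketch]
Let $H$ be a class of functions $h : \mathcal{X} \to \{-1,+1\}$ and let
$S = (z_1,\dots,z_m)$ be an i.i.d.\ sample drawn from a distribution $D$.
Then, with probability at least $1-\delta$ over the draw of $S$, every
$h \in H$ satisfies
\[
  R(h) \;\le\; \widehat{R}_S(h) \;+\; \mathfrak{R}_m(H)
        \;+\; \sqrt{\frac{\ln(1/\delta)}{2m}},
\]
where $R(h)$ is the true (generalization) error, $\widehat{R}_S(h)$ is the
empirical error of $h$ on $S$, and $\mathfrak{R}_m(H)$ is the Rademacher
complexity of $H$ for samples of size $m$.
\end{theorem}

Analogous uniform bounds in terms of the growth function, VC dimension, and covering numbers take the same "empirical error plus complexity plus confidence term" shape, differing in how the complexity term is measured.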
