Abstract

Using the replica technique we calculate the maximal possible difference between the learning error and the generalization error of a perceptron learning a linearly separable Boolean classification from examples. We consider both spherical and Ising constraints on the couplings of the perceptron, investigate learnable as well as unlearnable problems, and study the special situation where the class of perceptrons considered is restricted to the version space. The results are compared with the Vapnik-Chervonenkis bound and variants thereof. We find that these bounds are asymptotically tight up to logarithmic corrections.
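The paper's results come from an analytic replica calculation, but the underlying setup of a student perceptron learning a teacher-generated, linearly separable Boolean classification, with a learning (training) error and a generalization error to compare, can be illustrated numerically. The following is a purely illustrative sketch, not the paper's method: it trains a student with the classical perceptron rule on examples labeled by a random teacher, then estimates both errors empirically. All dimensions, sample sizes, and variable names here are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 20    # input dimension (illustrative choice)
P = 100   # number of training examples
M = 1000  # held-out examples used to estimate the generalization error

# Teacher couplings define a linearly separable Boolean classification.
teacher = rng.standard_normal(N)

X_train = rng.choice([-1.0, 1.0], size=(P, N))
y_train = np.sign(X_train @ teacher)
X_test = rng.choice([-1.0, 1.0], size=(M, N))
y_test = np.sign(X_test @ teacher)

# Student couplings trained with the classical perceptron rule.
w = np.zeros(N)
for _ in range(100):                 # training sweeps
    for x, y in zip(X_train, y_train):
        if np.sign(w @ x) != y:      # misclassified example: update couplings
            w += y * x

# Learning (training) error vs. empirical generalization error.
train_err = np.mean(np.sign(X_train @ w) != y_train)
test_err = np.mean(np.sign(X_test @ w) != y_test)
print(f"learning error: {train_err:.3f}, generalization error: {test_err:.3f}")
```

Because the task is learnable (a teacher realizes the labels), the perceptron rule converges and the learning error drops to zero, while the generalization error stays positive at finite P; the gap between the two is the quantity whose worst case the paper bounds.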
