Abstract

Bounds on the generalization ability of neural networks based on Vapnik-Chervonenkis (VC) theory are compared with statistical mechanics results for the case of the parity machine. For fixed phase space dimension, the VC dimension can be made arbitrarily large by increasing the number K of hidden units. Generalization is impossible up to a critical number of training examples that grows with the VC dimension. The asymptotic decrease of the generalization error $\varepsilon_G$ turns out to be independent of K, and the VC bounds strongly overestimate $\varepsilon_G$. This shows that the phase space dimension and the VC dimension can play independent and different roles in the generalization process.
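
For context, the parity machine referred to above is commonly defined as a two-layer network whose output is the product of the signs of its K hidden units. The notation below ($\mathbf{w}_k$ for the weight vector of hidden unit k, $\mathbf{x}_k$ for the input components feeding that unit) is illustrative and not taken from the abstract itself:

$$\sigma(\mathbf{x}) \;=\; \prod_{k=1}^{K} \operatorname{sign}\!\left(\mathbf{w}_k \cdot \mathbf{x}_k\right)$$

For K = 1 this reduces to a simple perceptron; increasing K raises the VC dimension while the phase space dimension (the total number of weights) can be held fixed, which is the setting the abstract compares against the VC bounds.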
