Abstract

An upper bound on pattern storage is stated for nonlinear feedforward networks with analytic activation functions, such as the multilayer perceptron and the radial basis function network. The bound is given in terms of the number of network weights, and it applies to networks with any number of output nodes and arbitrary connectivity. Starting from the strict interpolation equations and exact finite-degree polynomial models for the hidden units, a straightforward proof by contradiction is developed for the upper bound. Several networks, trained by conjugate gradient, are used to demonstrate the tightness of the bound for random patterns.
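
The bound itself is not reproduced on this page; the display below is a hedged sketch of the counting argument that the strict interpolation equations suggest, using assumed notation ($N_v$ patterns, $M$ output nodes, $N_w$ weights). The paper's exact statement, proved rigorously via the polynomial hidden-unit models and contradiction, may differ in detail.

```latex
% Sketch of the interpolation-counting argument (notation assumed here,
% not taken from the paper). Storing N_v patterns exactly means solving
% M * N_v scalar equations in the N_w weights:
\[
  y_k(\mathbf{x}_p;\,\mathbf{w}) = t_{pk},
  \qquad 1 \le p \le N_v, \quad 1 \le k \le M,
\]
% which, for generic targets, is solvable only if the equation count does
% not exceed the number of unknowns:
\[
  M N_v \le N_w
  \quad\Longrightarrow\quad
  N_v \le \left\lfloor N_w / M \right\rfloor.
\]
```

As a rough illustration of the experimental part of the abstract, the sketch below trains a one-hidden-layer network with an analytic (tanh) activation on random patterns by conjugate gradient and reports the final training error. All sizes, the tanh activation, and the use of SciPy's CG routine are assumptions made for illustration, not the paper's setup.

```python
# Minimal sketch (not the authors' code): memorizing random patterns with
# a one-hidden-layer tanh network trained by conjugate gradient.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

N_in, N_h, M = 4, 8, 1                    # layer sizes (assumed)
N_w = N_h * (N_in + 1) + M * (N_h + 1)    # total weights, incl. biases
N_v = N_w // M                            # patterns at the counting bound

X = rng.standard_normal((N_v, N_in))      # random input patterns
T = rng.standard_normal((N_v, M))         # random target patterns

def unpack(w):
    """Split the flat weight vector into layer matrices and biases."""
    i = N_h * N_in
    W1 = w[:i].reshape(N_h, N_in)
    b1 = w[i:i + N_h]; i += N_h
    W2 = w[i:i + M * N_h].reshape(M, N_h); i += M * N_h
    b2 = w[i:]
    return W1, b1, W2, b2

def mse(w):
    """Mean squared training error of the network with weights w."""
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1.T + b1)            # analytic hidden activations
    Y = H @ W2.T + b2                     # linear output layer
    return np.mean((Y - T) ** 2)

w0 = 0.5 * rng.standard_normal(N_w)
res = minimize(mse, w0, method='CG', options={'maxiter': 20000})
print(f"N_w = {N_w}, N_v = {N_v}, final MSE = {res.fun:.2e}")
```

Driving the error to near zero at N_v patterns, and failing to do so once the pattern count exceeds the bound, is the kind of behaviour the abstract's tightness experiments examine; this sketch only illustrates the setup, not the paper's results.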
