Abstract

This paper studies the question of lower bounds on the number of neurons and examples necessary to program a given task into feedforward neural networks. We introduce the notion of information complexity of a network to complement that of neural complexity. Neural complexity deals with lower bounds for the neural resources (numbers of neurons) needed by a network to perform a given task within a given tolerance. Information complexity measures lower bounds for the information (i.e., number of examples) needed about the desired input–output function. We study the interaction of the two complexities, and thus obtain lower bounds for the complexity of building and then programming feedforward nets for given tasks. We show something unexpected a priori: the interaction of the two can be simply bounded, so that they can be studied essentially independently. We construct radial basis function (RBF) algorithms of order n^3 that are information-optimal, and give example applications.
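
As a rough illustration (a minimal sketch, not the paper's construction), the order-n^3 cost of an RBF algorithm built from n examples can be seen in plain Gaussian-kernel interpolation: fitting the n weights requires solving a dense n-by-n linear system, which takes O(n^3) operations. The Gaussian kernel, its width, and the 1-D test function below are arbitrary choices made for this sketch.

    import numpy as np

    def rbf_fit(X, y, width=1.0):
        """Fit a Gaussian RBF interpolant to n examples (x_i, y_i).

        The dense n-by-n linear solve dominates the cost, O(n^3),
        matching the order-n^3 scaling mentioned in the abstract.
        """
        # Pairwise squared distances between the n centers
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        K = np.exp(-d2 / (2.0 * width ** 2))  # n x n Gaussian kernel matrix
        return np.linalg.solve(K, y)          # O(n^3) dense solve

    def rbf_predict(X_train, coeffs, X_new, width=1.0):
        """Evaluate the fitted RBF approximant at new inputs."""
        d2 = np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
        K = np.exp(-d2 / (2.0 * width ** 2))
        return K @ coeffs

    # Example: recover a smooth 1-D target from n = 20 samples
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(20, 1))
    y = np.sin(3 * X[:, 0])
    coeffs = rbf_fit(X, y)
    X_test = np.linspace(-1, 1, 5).reshape(-1, 1)
    print(rbf_predict(X, coeffs, X_test))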
