Abstract

Networks can be considered as approximation schemes. Multilayer networks of the perceptron type can approximate continuous functions arbitrarily well (Cybenko 1988, 1989; Funahashi 1989; Stinchcombe and White 1989). We prove that networks derived from regularization theory, including Radial Basis Functions (Poggio and Girosi 1989), have a similar property. From the point of view of approximation theory, however, the property of approximating continuous functions arbitrarily well is not sufficient for characterizing good approximation schemes. More critical is the property of best approximation. The main result of this paper is that multilayer perceptron networks, of the type used in backpropagation, do not have the best approximation property. For regularization networks (in particular Radial Basis Function networks) we prove existence and uniqueness of best approximation.
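
For reference, the notion of best approximation at issue can be stated as follows (this is the standard textbook formulation from approximation theory, not quoted from the paper): given a normed linear space $(X, \|\cdot\|)$ and a subset $A \subseteq X$, an element $a^* \in A$ is a best approximation to $f \in X$ if

$$\|f - a^*\| = \inf_{a \in A} \|f - a\|.$$

The set $A$ has the best approximation property when such an $a^*$ exists for every $f \in X$. Here $A$ is the set of input-output functions realizable by a fixed network architecture, so the negative result for multilayer perceptrons says that the infimum need not be attained by any network in the class, even though it can be approached arbitrarily closely.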
