Abstract

Poggio and Girosi showed that regularization principles lead to approximation schemes that are equivalent to networks with one layer of hidden units, called regularization networks. Summarizing their 1993 results, they show that regularization networks encompass a much broader range of approximation schemes, including many general additive models and some neural networks. In particular, additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. The same extension that leads from radial basis functions to hyper basis functions also leads from additive models to ridge approximation models, which contain Breiman's hinge functions and some forms of projection pursuit regression as special cases. The authors propose the term generalized regularization networks for this broad class of approximation schemes that follows from an extension of regularization.
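
For orientation, the standard Poggio-Girosi formulation behind these networks can be sketched as follows (notation here is illustrative, not taken from the abstract). One seeks the function f minimizing a regularized empirical risk

H[f] = \sum_{i=1}^{N} \left( y_i - f(\mathbf{x}_i) \right)^2 + \lambda \, \phi[f],

where \phi[f] is a smoothness functional and \lambda > 0 a regularization parameter. The minimizer takes the form of a network with one layer of hidden units,

f(\mathbf{x}) = \sum_{i=1}^{N} c_i \, G(\mathbf{x}, \mathbf{x}_i),

where G is the Green's function of the differential operator associated with \phi. A radial smoothness functional yields G(\mathbf{x}, \mathbf{x}_i) = G(\lVert \mathbf{x} - \mathbf{x}_i \rVert), i.e. radial basis functions; the additive and tensor-product smoothness functionals discussed above yield additive splines and tensor product splines in the same way.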
