Abstract

We had previously shown that regularization principles lead to approximation schemes that are equivalent to networks with one layer of hidden units, called regularization networks. We summarize some recent results (Girosi, Jones and Poggio, 1993) showing that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some neural networks. In particular, additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same extension that extends radial basis functions to hyper basis functions leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions and some forms of projection pursuit regression. We propose the term generalized regularization networks for this broad class of approximation schemes that follow from an extension of regularization.
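As a minimal sketch of the basic scheme described above (not the authors' implementation), the classical regularization network places one hidden unit at each data point and solves a linear system for the coefficients: f(x) = Σᵢ cᵢ G(x, xᵢ) with (G + λI)c = y. The Gaussian kernel, and the parameters `sigma` and `lam` below, are illustrative assumptions.

```python
# Sketch of a regularization network with Gaussian radial basis units.
# Assumes the classical solution (G + lam*I) c = y, where G is the Gram
# matrix of the kernel at the training points; sigma and lam are
# hypothetical choices, not values from the paper.
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    """Gram matrix G[i, j] = exp(-||x1_i - x2_j||^2 / (2 sigma^2))."""
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_regularization_network(X, y, lam=1e-3, sigma=1.0):
    """Solve (G + lam*I) c = y: one Gaussian hidden unit per data point."""
    G = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(G + lam * np.eye(len(X)), y)

def predict(X_train, c, X_new, sigma=1.0):
    """f(x) = sum_i c_i G(x, x_i), a network with one layer of hidden units."""
    return gaussian_kernel(X_new, X_train, sigma) @ c

# Usage: recover a smooth function from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
c = fit_regularization_network(X, y, lam=1e-2, sigma=0.5)
X_test = np.linspace(-3, 3, 5)[:, None]
print(predict(X, c, X_test, sigma=0.5))
```

The extensions surveyed in the paper (additive models, ridge approximation, hyper basis functions) modify the smoothness functional and hence the kernel G and the placement of its centers, while keeping this one-hidden-layer structure.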
