Abstract

We present a fairly general method for constructing classes of functions of finite scale-sensitive dimension (the scale-sensitive dimension is a generalization of the Vapnik–Chervonenkis dimension to real-valued functions). The construction is as follows: start from a class F of functions of finite VC dimension, take the convex hull co F of F, and then take the closure of co F in an appropriate sense. As an example, we study in more detail the case where F is the class of threshold functions. It is shown that this closure includes two important classes of functions:

• neural networks with one hidden layer and bounded output weights;
• the so-called Γ class of Barron, which was shown to satisfy a number of interesting approximation and closure properties.

We also give an integral representation in the form of a “continuous neural network” which generalizes Barron's. It is shown that the existence of an integral representation is equivalent to both L2 and L∞ approximability. A preliminary version of this paper was presented at EuroCOLT'95. The main difference from the conference version is the addition of Theorem 7, where we show that a key topological result fails when the VC dimension hypothesis is removed.
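For concreteness, the two steps of the construction can be sketched as follows. This is only an illustrative reading, not taken from the abstract itself: the weight bound C and the choice of the L2(μ) norm for the closure are assumptions made for the sketch, since the abstract leaves the precise sense of the closure open.

\[
\mathrm{co}\,F \;=\; \Bigl\{\, \sum_{i=1}^{k} c_i f_i \;:\; k \in \mathbb{N},\ f_i \in F,\ \sum_{i=1}^{k} |c_i| \le C \,\Bigr\},
\qquad
\overline{\mathrm{co}}\,F \;=\; \Bigl\{\, g \;:\; \inf_{h \in \mathrm{co}\,F} \|g - h\|_{L_2(\mu)} = 0 \,\Bigr\}.
\]

Under this reading, when F is the class of threshold functions, a one-hidden-layer network with threshold units whose output weights sum in absolute value to at most C is (roughly, modulo the handling of the output bias) an element of co F, which is how the first bullet above fits into the picture.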
