Abstract

We consider the problem of approximating a smooth target function and its derivatives by networks involving superpositions and translations of a fixed activation function. The approximation is with respect to the sup-norm, and the rate is shown to be of order O(n^{-1/2}); that is, the rate is independent of the dimension d. The results apply to neural and wavelet networks and extend the work of Barron (see Proc. 7th Yale Workshop on Adaptive and Learning Systems, May 1992, and IEEE Trans. Inform. Theory, vol. 39, p. 930, 1993). The approach involves probabilistic methods based on central limit theorems for empirical processes indexed by classes of functions.
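To fix ideas, the following is a sketch in our own notation, not taken verbatim from the paper, of the generic form of such a result: the approximant is a single-hidden-layer network built from n translated and scaled copies of a fixed activation function \phi, and the sup-norm error for the function and its derivatives decays at a dimension-independent rate:

    f_n(x) = \sum_{k=1}^{n} c_k \, \phi(a_k \cdot x + b_k),
    \qquad
    \sup_{x \in K} \bigl| D^{\alpha} f(x) - D^{\alpha} f_n(x) \bigr| \le C \, n^{-1/2},

for multi-indices \alpha up to a prescribed order, uniformly over a compact set K. The precise smoothness class of the target f and the dependence of the constant C on that class are as specified in the paper.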
