Abstract
We investigate rates of approximation of multivariable functions by one-hidden-layer neural networks with a general hidden unit function. Under mild assumptions on the hidden unit function, we derive upper bounds on rates of approximation (measured both by the number of hidden units and by the size of parameters) in terms of various norms of the function to be approximated and its higher-order moduli of continuity.
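For concreteness, the networks in question can be written in the standard one-hidden-layer form (the symbols below are illustrative notation, not taken from the abstract): with $n$ hidden units, hidden unit function $\phi$, inner weights $a_i \in \mathbb{R}^d$, biases $b_i \in \mathbb{R}$, and outer weights $w_i \in \mathbb{R}$, the network computes

```latex
f_n(x) \;=\; \sum_{i=1}^{n} w_i \,\phi\!\left(a_i \cdot x + b_i\right),
\qquad x \in \mathbb{R}^d .
```

Rates of approximation then bound the distance from a target function $f$ to the set of such $f_n$, as a function of $n$ and of constraints on the sizes of the parameters $w_i$, $a_i$, $b_i$.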