Abstract

In this paper, we introduce neural network interpolation operators activated by smooth ramp functions. The smoothness of the ramp functions yields useful estimates of the derivatives of the networks, which, combined with techniques from approximation theory, enable us to establish converse estimates for approximation by neural networks. We establish both the direct and the converse results of approximation by the new neural network operators, and thus obtain the essential approximation rate. To improve the approximation rate for smooth functions, we further introduce linear combinations of the new operators; these combinations interpolate the objective function and its derivative. We also estimate the uniform convergence rate and the simultaneous approximation rate of the new combinations.
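To fix ideas, the following is a minimal sketch of an interpolation operator of this general type. The specific ramp functions and operators defined in the paper are not reproduced here; as an illustrative assumption we use the C^1 "smoothstep" polynomial as the smooth ramp, and the classical construction in which a bell-shaped kernel is formed from two shifted ramps, so that the operator reproduces the sample values at the nodes k/n.

```python
import numpy as np

def smooth_ramp(x):
    # A C^1 smooth ramp (illustrative choice, not the paper's):
    # 0 for x <= 0, the smoothstep 3x^2 - 2x^3 on [0, 1], 1 for x >= 1.
    y = np.clip(x, 0.0, 1.0)
    return 3 * y**2 - 2 * y**3

def phi(x):
    # Bell-shaped kernel from two shifted ramps.
    # At integers, phi(k) = 1 if k == 0 and 0 otherwise,
    # which gives the interpolation property at the nodes.
    return smooth_ramp(x + 1) - smooth_ramp(x)

def interp_operator(f, n, x):
    # Neural-network-style interpolation operator on [0, 1]:
    #   F_n(f, x) = sum_{k=0}^{n} f(k/n) * phi(n*x - k)
    k = np.arange(n + 1)
    x = np.asarray(x, dtype=float)
    return np.sum(f(k / n) * phi(n * x[..., None] - k), axis=-1)
```

Because phi vanishes at all nonzero integers, F_n(f, .) agrees with f at every node k/n; smoothness of the ramp makes F_n(f, .) differentiable, which is what the derivative estimates in the paper exploit.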
