Abstract

This article determines the rate of convergence to the unit operator of each of three newly introduced perturbed normalized neural network operators of one hidden layer. The rates are expressed through the modulus of continuity of the involved function or of its higher-order derivative, which appears on the right-hand side of the associated Jackson-type inequalities. The activation function is very general; in particular, it can derive from any sigmoid or bell-shaped function. The right-hand sides of our convergence inequalities do not depend on the activation function. The sample functionals are of Stancu, Kantorovich and quadrature types. We give applications for the first derivative of the involved function.

2010 AMS Mathematics Subject Classification: 41A17, 41A25, 41A30, 41A36.
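To make the abstract's objects concrete: the following is an illustrative sketch only, since the paper's exact operators are not reproduced here. The forms below follow the standard normalized one-hidden-layer construction used in this line of work; the notation ($G_n$, $\Phi$, $\alpha$, the constant $c$) is assumed for illustration, not taken from the source.

```latex
% A normalized one-hidden-layer operator built from a density function \Phi
% (e.g., derived from a sigmoid or bell-shaped activation):
\[
  G_n(f,x) \;=\;
  \frac{\sum_{k=-n^2}^{n^2} f\!\left(\tfrac{k}{n}\right)
        \Phi\!\left(n^{1-\alpha}\!\left(x-\tfrac{k}{n}\right)\right)}
       {\sum_{k=-n^2}^{n^2}
        \Phi\!\left(n^{1-\alpha}\!\left(x-\tfrac{k}{n}\right)\right)},
  \qquad 0<\alpha<1 .
\]
% A Jackson-type inequality then bounds the deviation from the unit
% operator via the first modulus of continuity \omega_1:
\[
  \left| G_n(f,x)-f(x) \right| \;\le\;
  c\,\omega_1\!\left(f,\tfrac{1}{n^{1-\alpha}}\right),
  \qquad
  \omega_1(f,\delta) := \sup_{|x-y|\le\delta}\left|f(x)-f(y)\right| .
\]
```

The abstract's claim that the right-hand sides do not depend on the activation function corresponds, in this generic form, to the constant $c$ and the argument of $\omega_1$ being independent of $\Phi$.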
