Abstract

In this paper we establish estimates for the error of approximation, in the Lp-norm, achieved by a family of neural network (NN) operators. The estimates are expressed in terms of the averaged modulus of smoothness introduced by Sendov and Popov, also known as the τ-modulus, for bounded and measurable functions on the interval [−1,1]. As a consequence, we deduce an Lp convergence theorem for this family of NN operators for functions which are bounded, measurable, and Riemann integrable on [−1,1]. To reach these aims, we first establish a number of preliminary results, among them an estimate for the Lp-norm of the operators and an asymptotic-type theorem for the NN operators for functions belonging to Sobolev spaces.
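For readers unfamiliar with the τ-modulus, the sketch below recalls one standard convention for the averaged modulus of smoothness of Sendov and Popov, written here on [−1,1]; the exact order and normalization used in the paper may differ.

```latex
% Local r-th modulus of smoothness of f at the point x (one standard convention):
\[
  \omega_r(f, x; \delta) \;:=\; \sup\Bigl\{ \bigl|\Delta_h^r f(t)\bigr| \;:\;
    t,\; t + rh \in \bigl[x - \tfrac{r\delta}{2},\, x + \tfrac{r\delta}{2}\bigr] \cap [-1,1] \Bigr\},
\]
% Averaged (tau-) modulus: the L^p-norm of the local modulus, viewed as a function of x:
\[
  \tau_r(f, \delta)_p \;:=\; \bigl\| \omega_r(f, \cdot\,; \delta) \bigr\|_{L^p[-1,1]},
  \qquad 1 \le p < +\infty,
\]
% where \Delta_h^r denotes the usual r-th forward difference with step h.
```

A classical property of the τ-modulus, consistent with the role of Riemann integrability in the convergence theorem above, is that for a bounded measurable f the quantity τ_r(f, δ)_p tends to 0 as δ → 0+ precisely when f is Riemann integrable on the interval.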
