Abstract

We construct neural network interpolation operators based on newly defined activation functions and establish the rate at which these operators approximate continuous functions. Under additional smoothness assumptions on the activation function, we prove two key inequalities for the derivatives of the operators. Using these inequalities, together with the K-functional and the Berens–Lorentz lemma from approximation theory, we obtain a converse theorem of approximation by the operators. To approximate smooth functions, we further introduce special combinations of the operators, which can be regarded as feedforward neural networks (FNNs) with four layers and which approximate the target function and its derivative simultaneously. Finally, we introduce a Kantorovich-type variant of the operators and establish both the direct theorem and the converse theorem of approximation by these operators in Lp spaces with 1 ≤ p ≤ ∞.
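For orientation, neural network interpolation operators of this family commonly take the following schematic form in the literature; the abstract does not define the construction, so the equally spaced nodes k/n on [0,1], the activation σ, and the normalization below are assumptions, not the paper's specific operators:

\[
F_n(f; x) \;=\; \sum_{k=0}^{n} f\!\left(\frac{k}{n}\right) \sigma(nx - k), \qquad x \in [0,1],
\]

with the Kantorovich-type variant replacing the point sample f(k/n) by the local average

\[
n \int_{k/n}^{(k+1)/n} f(t)\, dt,
\]

a substitution that makes the operators well defined on integrable functions and is what permits approximation results in Lp spaces rather than only for continuous functions.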
