Abstract
We construct neural network interpolation operators based on newly defined activation functions and establish the rate at which these operators approximate continuous functions. Under additional smoothness assumptions on the activation function, we establish two key inequalities for the derivatives of the operators. Using these inequalities together with the K-functional and the Berens–Lorentz lemma from approximation theory, we obtain a converse theorem for approximation by the operators. To approximate smooth functions, we further introduce special combinations of the operators; these combinations can be regarded as feedforward neural networks (FNNs) with four layers and approximate the target function and its derivative simultaneously. Finally, we introduce a Kantorovich-type variant of the operators and establish both the direct and the converse theorems of approximation by these Kantorovich-type operators in L^p spaces with 1 ≤ p ≤ ∞.
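For orientation, here is a minimal sketch of the kind of operator typically studied in this literature; the paper's own activation functions and constructions are specified in the body, so the formulas below are illustrative assumptions rather than the paper's exact definitions. Given samples of f at the equally spaced nodes k/n, a single-hidden-layer interpolation operator commonly takes the form

\[
F_n(f;x) \;=\; \sum_{k=0}^{n} f\!\left(\tfrac{k}{n}\right)\,\phi(nx-k), \qquad x \in [0,1],
\]

where \(\phi\) is built from the activation function and satisfies \(\phi(j)=\delta_{0,j}\) at the integers, so that \(F_n(f;\,j/n)=f(j/n)\), i.e. the operator interpolates f at the nodes. In a Kantorovich-type variant, the point evaluations \(f(k/n)\) are replaced by local mean values, which is what makes approximation meaningful in L^p rather than only for continuous functions:

\[
K_n(f;x) \;=\; \sum_{k=0}^{n-1} \left( n\int_{k/n}^{(k+1)/n} f(t)\,dt \right) \phi(nx-k).
\]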