Abstract
It is well known that feedforward neural networks (FNNs) with a sigmoidal activation function are universal approximators: for any continuous function defined on a compact set, there exists an FNN that approximates the function to arbitrary accuracy. This property provides a theoretical guarantee that FNNs can serve as efficient learning machines. This paper addresses the construction and approximation properties of FNNs. We construct an FNN with a sigmoidal activation function and estimate its approximation error. In particular, an inverse theorem of approximation is established, which yields an equivalence characterization theorem of the approximation and reveals the relationship between the topological structure of an FNN and its approximation ability. Key tools in this study are the modulus of continuity of a function, the K-functional, and the relationship between them; in addition, two Bernstein-type inequalities are established.
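For orientation, a minimal sketch of the standard single-hidden-layer model this line of work concerns, written in generic notation; the symbols F_N, c_i, w_i, b_i, and N are illustrative assumptions and do not describe the paper's specific construction:

% Single-hidden-layer FNN with sigmoidal activation \sigma,
% approximating a continuous target f on a compact set K
% (generic form; not the paper's construction):
\[
  F_N(x) \;=\; \sum_{i=1}^{N} c_i \,\sigma(w_i \cdot x + b_i),
  \qquad x \in K \subset \mathbb{R}^d .
\]
% Approximation errors for such networks are typically bounded in
% terms of the modulus of continuity of f,
\[
  \omega(f, t) \;=\; \sup_{\|x - y\| \le t} \, |f(x) - f(y)|,
\]
% which is equivalent, up to constants, to a K-functional; this
% equivalence is what inverse (Bernstein-type) theorems exploit.

Direct theorems bound the approximation error from above by \omega(f, t); an inverse theorem of the kind established here runs the other way, deducing smoothness of f from the achievable approximation rate, and together the two directions give the equivalence characterization mentioned in the abstract.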