Abstract
In this paper, a new family of neural network (NN) operators is introduced. The idea is to consider a Durrmeyer-type version of the widely studied discrete NN operators by Costarelli and Spigler (Neural Netw 44:101–106, 2013). Such operators are constructed using special density functions generated from suitable sigmoidal functions, while the reconstruction coefficients are based on a convolution between a general kernel function $\chi$ and the function being reconstructed, $f$. Here, we investigate their approximation capabilities, establishing both pointwise and uniform convergence theorems for continuous functions. We also provide quantitative estimates for the order of approximation by means of the modulus of continuity of $f$; these estimates turn out to be strongly influenced by the asymptotic behaviour of the sigmoidal function $\sigma$. Our study also shows that, under suitable assumptions, the estimates we provide are the best possible. Finally, $L^p$-approximation is also established. At the end of the paper, examples of activation functions are discussed.
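For orientation only, a plausible form of such a Durrmeyer-type operator, reconstructed from the description above and not taken from the paper's own definitions (the density $\phi_\sigma$, the interval $[a,b]$, and the normalization are assumptions), is
$$
D_n^{\chi}(f,x) \;=\; \frac{\displaystyle\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor} \Big( n \int_{\mathbb{R}} \chi(nt-k)\, f(t)\, dt \Big)\, \phi_\sigma(nx-k)}{\displaystyle\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor} \phi_\sigma(nx-k)}, \qquad x \in [a,b],
$$
where $\phi_\sigma(x) := \tfrac{1}{2}\big(\sigma(x+1)-\sigma(x-1)\big)$ denotes a density function generated by the sigmoidal function $\sigma$. In this sketch, the discrete operators of Costarelli and Spigler would be recovered by replacing the convolution-type coefficients with the sample values $f(k/n)$.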