Abstract
Motivated by the recent development of numerical solvers for partial differential equations based on deep neural networks, we establish in this work approximation results of deep neural networks for smooth functions measured in Sobolev norms. Our approximation results are nonasymptotic in the sense that the error bounds are explicitly characterized in terms of both the width and the depth of the networks simultaneously, with all involved constants explicitly determined. Namely, for $f \in C^s([0,1]^d)$, we show that deep ReLU networks of width $\mathcal{O}(N \log N)$ and of depth $\mathcal{O}(L \log L)$ can achieve a nonasymptotic approximation rate of $\mathcal{O}(N^{-2(s-1)/d} L^{-2(s-1)/d})$ with respect to the $W^{1,p}([0,1]^d)$ norm for $p \in [1,\infty)$. If either the ReLU function or its square is used as the activation function to construct deep neural networks of width $\mathcal{O}(N \log N)$ and of depth $\mathcal{O}(L \log L)$ approximating $f \in C^s([0,1]^d)$, the approximation rate is $\mathcal{O}(N^{-2(s-n)/d} L^{-2(s-n)/d})$ with respect to the $W^{n,p}([0,1]^d)$ norm for $p \in [1,\infty)$.
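For concreteness, the first of these bounds can be displayed schematically as follows. This is an illustrative restatement, not a quotation of the paper's theorem: $\Phi(N,L)$ is shorthand introduced here for the class of deep ReLU networks of width $\mathcal{O}(N \log N)$ and depth $\mathcal{O}(L \log L)$, and $C = C(f, s, d, p)$ stands for the constant that the paper determines explicitly.

$$\inf_{\phi \in \Phi(N,L)} \| f - \phi \|_{W^{1,p}([0,1]^d)} \;\le\; C \, N^{-2(s-1)/d} L^{-2(s-1)/d}, \qquad p \in [1,\infty).$$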