Abstract

For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d\to\mathbb{R}$ in the norm of $L^2(\mathbb{R}^d,\gamma_d)$, where $d\in\mathbb{N}\cup\{\infty\}$. Here $\gamma_d$ denotes the Gaussian product probability measure on $\mathbb{R}^d$. We consider in particular ReLU and $\mathrm{ReLU}^k$ activations for integer $k\geq 2$. For $d\in\mathbb{N}$, we show exponential convergence rates in terms of the neural network size. In case $d=\infty$, under suitable smoothness and sparsity assumptions on $f:\mathbb{R}^{\mathbb{N}}\to\mathbb{R}$, with $\gamma_\infty$ denoting an infinite (Gaussian) product measure on $\mathbb{R}^{\mathbb{N}}$, we prove dimension-independent expression rate bounds in the norm of $L^2(\mathbb{R}^{\mathbb{N}},\gamma_\infty)$. The rates only depend on quantified holomorphy of (an analytic continuation of) the map $f$ to a product of strips in $\mathbb{C}^d$ (in $\mathbb{C}^{\mathbb{N}}$ for $d=\infty$, respectively). As an application, we prove expression rate bounds of deep ReLU-NNs for response surfaces of elliptic PDEs with log-Gaussian random field inputs.
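
For orientation, the display below recalls the standard objects the abstract refers to: the Gaussian product measure $\gamma_d$, the approximation error measured in $L^2(\mathbb{R}^d,\gamma_d)$ for a neural network approximant $\tilde f$ of $f$, and the $\mathrm{ReLU}^k$ activation. This is an illustrative sketch using the usual definitions of these objects, not notation quoted from the paper itself.

```latex
% Illustrative standard definitions (assumed, not quoted from the paper):
% Gaussian product measure on R^d
\[
  \gamma_d \;=\; \bigotimes_{j=1}^{d} \gamma_1 ,
  \qquad
  \mathrm{d}\gamma_1(x) \;=\; \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\,\mathrm{d}x .
\]
% Error of a NN approximant \tilde f of f, measured in the weighted L^2-norm
\[
  \bigl\| f - \tilde f \bigr\|_{L^2(\mathbb{R}^d,\gamma_d)}
  \;=\;
  \Bigl( \int_{\mathbb{R}^d} \bigl| f(x) - \tilde f(x) \bigr|^2 \,\mathrm{d}\gamma_d(x) \Bigr)^{1/2} .
\]
% ReLU and ReLU^k activation functions
\[
  \mathrm{ReLU}(x) \;=\; \max\{0,x\},
  \qquad
  \mathrm{ReLU}^k(x) \;=\; \bigl(\max\{0,x\}\bigr)^{k},
  \quad k \in \mathbb{N},\ k \ge 2 .
\]
```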
