Abstract

For a parameter dimension $d \in \mathbb{N}$, we consider the approximation of many-parametric maps $u: [-1,1]^d \rightarrow \mathbb{R}$ by deep ReLU neural networks. The input dimension $d$ may possibly be large, and we assume quantitative control of the domain of holomorphy of $u$: i.e., $u$ admits a holomorphic extension to a Bernstein polyellipse $\mathcal{E}_{\rho_1} \times \cdots \times \mathcal{E}_{\rho_d} \subset \mathbb{C}^d$ of semiaxis sums $\rho_i > 1$ containing $[-1,1]^d$. We establish the exponential rate $O(\exp(-b N^{1/(d+1)}))$ of expressive power in terms of the total NN size $N$ and of the input dimension $d$ of the ReLU NN in $W^{1,\infty}([-1,1]^d)$. The constant $b > 0$ depends on $(\rho_j)_{j=1}^d$, which characterizes the coordinate-wise sizes of the Bernstein ellipses for $u$. We also prove exponential convergence in stronger norms for the approximation by DNNs with more regular, so-called “rectified power unit” activations. Finally, we extend DNN expression rate bounds also to two classes of non-holomorphic functions, in particular to $d$-variate, Gevrey-regular functions, and, by composition, to certain multivariate probability distribution functions with Lipschitz marginals.
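Written out, the expression rate claim above takes the following form; here $\mathcal{N}_N$ (notation chosen for this summary, not necessarily that of the paper) denotes the class of deep ReLU networks of total size at most $N$, and $C > 0$ is a generic constant independent of $N$:

$$
\inf_{\tilde u \in \mathcal{N}_N} \| u - \tilde u \|_{W^{1,\infty}([-1,1]^d)} \;\le\; C \exp\!\bigl(-b\, N^{1/(d+1)}\bigr), \qquad N \in \mathbb{N},
$$

with $b > 0$ depending on the polyradii $(\rho_j)_{j=1}^d$ of the Bernstein polyellipse to which $u$ extends holomorphically.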
