We prove lower bounds on the worst-case error of numerical integration in tensor product spaces. The information complexity is the minimal number N of function evaluations necessary for the N-th minimal error to be less than a factor ε times the initial error, i.e., the error for N = 0, where ε lies in (0,1). We are interested in the extent to which the information complexity depends on the number d of variables of the integrands. If the information complexity grows exponentially fast in d, the integration problem is said to suffer from the curse of dimensionality.

Under the assumption that a worst-case function exists for the univariate problem, we present two methods for proving lower bounds on the information complexity. The first method is based on a suitable decomposition of the worst-case function and can be seen as a generalization of the method of decomposable reproducing kernels. The second method, although applicable only to positive quadrature rules, does not require such a decomposition. Instead, it is based on a spline approximation of the worst-case function and can be applied to analytic functions. Several applications of both methods are presented.
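The notions above admit a standard formalization, sketched here for the reader's convenience; the symbols e(N, d) for the N-th minimal error and n(ε, d) for the information complexity are conventional notation in tractability studies, not necessarily the notation used in the paper itself:

```latex
% e(N,d): N-th minimal worst-case error of integration in d variables;
% e(0,d) is the initial error (no function evaluations used).
% Information complexity: the smallest N for which the N-th minimal
% error is below eps times the initial error.
n(\varepsilon, d) = \min\bigl\{\, N \in \mathbb{N}_0 \;:\; e(N, d) \le \varepsilon \, e(0, d) \,\bigr\},
\qquad \varepsilon \in (0,1).
% The problem suffers from the curse of dimensionality if n(eps,d)
% grows exponentially in d, i.e., there exist C > 0 and gamma > 0 with
n(\varepsilon, d) \ge C\,(1+\gamma)^{d} \quad \text{for some } \varepsilon \in (0,1)
\text{ and infinitely many } d.
```

Lower bounds of the kind proved in the paper are statements that n(ε, d) is at least of this exponential order for the spaces considered.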