We study the circumradius of the intersection of an $m$-dimensional ellipsoid $\mathcal E$ with semi-axes $\sigma_1 \geq \dots \geq \sigma_m$ with random subspaces of codimension $n$, where $n$ can be much smaller than $m$. We find that, under certain assumptions on $\sigma$, this random radius $\mathcal R_n = \mathcal R_n(\sigma)$ is of the same order as the minimal such radius $\sigma_{n+1}$ with high probability. In other situations $\mathcal R_n$ is close to the maximum $\sigma_1$. The random variable $\mathcal R_n$ naturally corresponds to the worst-case error of the best algorithm based on random information for $L_2$-approximation of functions from a compactly embedded Hilbert space $H$ with unit ball $\mathcal E$. In particular, $\sigma_k$ is the $k$th largest singular value of the embedding $H \hookrightarrow L_2$. In this formulation, one can also consider the case $m = \infty$, and we prove that random information behaves very differently depending on whether $\sigma \in \ell_2$ or not. For $\sigma \notin \ell_2$ we get $\mathbb E[\mathcal R_n] = \sigma_1$ and random information is completely useless. For $\sigma \in \ell_2$ the expected radius tends to zero at least at rate $o(1/\sqrt{n})$ as $n \to \infty$. In the important case
\[ \sigma_k \asymp k^{-\alpha} \ln^{-\beta}(k+1), \]
where $\alpha > 0$ and $\beta \in \mathbb R$ (which corresponds to various Sobolev embeddings), we prove
\begin{equation*}
\mathbb E[\mathcal R_n(\sigma)] \asymp \left\{\begin{array}{cl}
\sigma_1 & \text{if} \quad \alpha < 1/2 \ \text{or}\ \beta \leq \alpha = 1/2, \\[2mm]
\sigma_{n+1}\,\sqrt{\ln(n+1)} \quad & \text{if} \quad \beta > \alpha = 1/2, \\[2mm]
\sigma_{n+1} & \text{if} \quad \alpha > 1/2.
\end{array}\right.
\end{equation*}
In the proofs we use a comparison result for Gaussian processes à la Gordon, exponential estimates for sums of chi-squared random variables, and estimates for the extreme singular values of (structured) Gaussian random matrices. The upper bound is constructive: it is proven for the worst-case error of a least squares estimator.
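For finite $m$, the random radius can be simulated directly. The following sketch (added for illustration; it is not taken from the paper) uses the parametrization $\mathcal E = \{Dy : \|y\|_2 \leq 1\}$ with $D = \operatorname{diag}(\sigma)$: for an $n \times m$ Gaussian matrix $G$, a point $x = Dy$ lies in $\ker(G)$ iff $y \in \ker(GD)$, so the circumradius of $\mathcal E \cap \ker(G)$ equals the largest singular value of $DM$, where $M$ is an orthonormal basis of $\ker(GD)$.

```python
import numpy as np

def random_radius(sigma, n, rng):
    """One sample of R_n: circumradius of the ellipsoid with semi-axes
    `sigma` intersected with the kernel of a random n x m Gaussian matrix.
    Assumes all entries of sigma are positive."""
    m = len(sigma)
    G = rng.standard_normal((n, m))   # random codimension-n subspace ker(G)
    D = np.diag(sigma)
    # Orthonormal basis M of ker(G D): the last m - n right singular vectors.
    _, _, Vt = np.linalg.svd(G @ D)
    M = Vt[n:].T                      # m x (m - n), orthonormal columns
    # Circumradius = largest singular value of D restricted to that basis.
    return np.linalg.norm(D @ M, 2)
```

By Gelfand-width considerations, every sample satisfies $\sigma_{n+1} \leq \mathcal R_n \leq \sigma_1$; for a ball ($\sigma_1 = \dots = \sigma_m$) the intersection with any subspace is a ball of the same radius, which gives a quick sanity check.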