Abstract

We consider the problem of robust mean and location estimation with respect to any pseudo-norm of the form \(x\in {{\mathbb {R}}}^d\mapsto \left\| x\right\| _S = \sup _{v\in S}\langle v,x \rangle \), where S is any symmetric subset of \({{\mathbb {R}}}^d\). We show that the deviation-optimal minimax sub-Gaussian rate at confidence level \(1-\delta \) is $$\begin{aligned} \max \left( \frac{\ell ^*(\Sigma ^{1/2}S)}{\sqrt{N}},\ \sup _{v\in S}\left\| \Sigma ^{1/2}v\right\| _2\sqrt{\frac{\log (1/\delta )}{N}}\right) , \end{aligned}$$ where \(\ell ^*(\Sigma ^{1/2}S)\) is the Gaussian mean width of \(\Sigma ^{1/2}S\) and \(\Sigma \) is the covariance of the data. This improves the entropic minimax lower bound from Lugosi and Mendelson (Probab Theory Relat Fields 175(3–4):957–973, 2019) and closes the gap, characterized by Sudakov’s inequality, between the entropy and the Gaussian mean width for this problem. This shows that the right statistical complexity measure for the mean estimation problem is the Gaussian mean width. We also show that this rate can be achieved by a solution to a convex optimization problem in the adversarial and \(L_2\) heavy-tailed setup, by considering the minimum of certain Fenchel–Legendre transforms constructed using the median-of-means principle. We finally show that this rate may also be achieved in situations where there is not even a first moment but a location parameter exists.
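As a point of reference (a standard specialization, not stated in the abstract itself), take \(S=B_2^d\), the Euclidean unit ball, so that \(\left\| \cdot \right\| _S\) is the usual Euclidean norm, and write \(G\sim {\mathcal {N}}(0,I_d)\). Then $$\begin{aligned} \ell ^*(\Sigma ^{1/2}B_2^d) = {{\mathbb {E}}}\left\| \Sigma ^{1/2}G\right\| _2 \le \sqrt{{\text {Tr}}(\Sigma )}, \qquad \sup _{v\in B_2^d}\left\| \Sigma ^{1/2}v\right\| _2 = \left\| \Sigma \right\| _{op}^{1/2}, \end{aligned}$$ so the general bound reduces, up to absolute constants, to $$\begin{aligned} \max \left( \sqrt{\frac{{\text {Tr}}(\Sigma )}{N}},\ \sqrt{\frac{\left\| \Sigma \right\| _{op}\log (1/\delta )}{N}}\right) , \end{aligned}$$ the familiar sub-Gaussian deviation rate for mean estimation in the Euclidean norm.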
