Abstract

The $\gamma_2$ norm of a real $m\times n$ matrix $A$ is the minimum number $t$ such that the column vectors of $A$ are contained in a $0$-centered ellipsoid $E\subseteq\mathbb{R}^m$ that is in turn contained in the hypercube $[-t, t]^m$. We prove that this classical quantity approximates the \emph{hereditary discrepancy} $\mathrm{herdisc}(A)$ as follows: $\gamma_2(A) = O(\log m)\cdot \mathrm{herdisc}(A)$ and $\mathrm{herdisc}(A) = O(\sqrt{\log m}\,)\cdot\gamma_2(A)$. Since $\gamma_2$ is polynomial-time computable, this yields a polynomial-time approximation algorithm for hereditary discrepancy. Both inequalities are shown to be asymptotically tight. We then demonstrate on several examples the power of the $\gamma_2$ norm as a tool for proving lower and upper bounds in discrepancy theory. Most notably, we prove a new lower bound of $\Omega(\log^{d-1} n)$ for the \emph{$d$-dimensional Tusnády problem}, which asks for the combinatorial discrepancy of an $n$-point set in $\mathbb{R}^d$ with respect to axis-parallel boxes. For $d>2$, this improves the previous best lower bound, which was of order approximately $\log^{(d-1)/2}n$, and it comes close to the best known upper bound of $O(\log^{d+1/2}n)$, for which we also obtain a new, very simple proof.
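As an illustration of the polynomial-time computability of $\gamma_2$, the following is a minimal sketch (not part of the paper) of the standard semidefinite-programming formulation: $\gamma_2(A)$ equals the smallest $t$ for which there is a positive semidefinite matrix whose off-diagonal block is $A$ and whose diagonal entries are all at most $t$. The sketch assumes the Python library cvxpy with an SDP-capable solver such as SCS.

```python
# Sketch: compute gamma_2(A) via its Gram-matrix SDP formulation.
# gamma_2(A) = min t such that Z = [[W1, A], [A^T, W2]] is PSD and
# every diagonal entry of Z is at most t.  (Assumes cvxpy + an SDP solver.)
import cvxpy as cp
import numpy as np

def gamma_2(A: np.ndarray) -> float:
    m, n = A.shape
    Z = cp.Variable((m + n, m + n), symmetric=True)
    t = cp.Variable()
    constraints = [
        Z >> 0,            # Z is positive semidefinite
        Z[:m, m:] == A,    # off-diagonal block equals A
        cp.diag(Z) <= t,   # all diagonal entries bounded by t
    ]
    cp.Problem(cp.Minimize(t), constraints).solve()
    return float(t.value)

# Example: the identity matrix has gamma_2 = 1.
print(gamma_2(np.eye(3)))  # ~1.0
```

This SDP is the Gram-matrix form of the factorization definition $\gamma_2(A) = \min_{A=BC}\,(\text{max row norm of }B)\cdot(\text{max column norm of }C)$, which coincides with the ellipsoid definition given above.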
