Abstract
This chapter is dedicated to advanced probability results required in the more involved arguments on random measurement matrices, notably precise bounds for Gaussian random matrices and the analysis of random partial Fourier matrices. First, norms of Gaussian vectors in expectation are discussed, followed by Rademacher sums and the symmetrization principle. Khintchine inequalities provide bounds for moments of Rademacher sums. Then decoupling inequalities are covered; they simplify the analysis of double sums of random variables by replacing one instance of a random vector with an independent copy. The noncommutative Bernstein inequality, treated next, bounds the tail of a sum of independent random matrices in the operator norm. Dudley’s inequality bounds the supremum of a subgaussian process by an integral over covering numbers of the index set of the process. Slepian’s and Gordon’s lemmas compare certain functions of Gaussian vectors in terms of their covariance structures. Together with the concentration of measure principle for Lipschitz functions of Gaussian vectors, covered next, they provide powerful tools for the analysis of Gaussian random matrices. Finally, the chapter discusses Talagrand’s inequality, that is, a Bernstein-type inequality for suprema of empirical processes.
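To indicate the flavor of the results summarized above, two representative bounds can be sketched in standard form; the constants and exact hypotheses used in the chapter may differ, so the following should be read as illustrative statements only.

```latex
% Symmetrization principle (a common form): for independent mean-zero
% random vectors \xi_1, \dots, \xi_M in a normed space and an independent
% Rademacher sequence \epsilon_1, \dots, \epsilon_M,
\[
  \mathbb{E}\Big\| \sum_{\ell=1}^{M} \xi_\ell \Big\|
  \;\le\; 2\, \mathbb{E}\Big\| \sum_{\ell=1}^{M} \epsilon_\ell \, \xi_\ell \Big\| .
\]

% Dudley's inequality (up to an absolute constant C): for a centered
% subgaussian process (X_t)_{t \in T} with associated pseudometric d and
% covering numbers N(T, d, u),
\[
  \mathbb{E} \sup_{t \in T} X_t
  \;\le\; C \int_{0}^{\infty} \sqrt{\log N(T, d, u)} \,\mathrm{d}u .
\]
```

The first bound lets one pass from a sum of arbitrary independent mean-zero vectors to a Rademacher sum, to which Khintchine-type moment estimates apply; the second converts a supremum over a process into a geometric quantity, the entropy integral of the index set.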