Abstract

Let $X_N$ be a symmetric $N\times N$ random matrix whose $\sqrt{N}$-scaled centered entries are uniformly square integrable. We prove that if the entries of $X_N$ can be partitioned into independent subsets each of size $o(\log N)$, then the empirical eigenvalue distribution of $X_N$ converges weakly to its mean in probability. This significantly extends the best previously known results on convergence of eigenvalues for matrices with correlated entries (where the partition subsets are blocks of size $O(1)$). We prove this result by developing a new log-Sobolev inequality, generalizing the second author's mollified log-Sobolev inequalities: we show that if $\mathbf{Y}$ is a bounded random vector and $\mathbf{Z}$ is a standard normal random vector independent from $\mathbf{Y}$, then the law of $\mathbf{Y}+\sqrt{t}\,\mathbf{Z}$ satisfies a log-Sobolev inequality for all $t>0$, and we give bounds on the optimal log-Sobolev constant.

Highlights

  • Random matrix theory is primarily interested in the convergence of statistics associated to the eigenvalues of N × N matrices whose entries are random variables with a prescribed joint distribution

  • There is a large body of literature devoted to necessary and sufficient conditions for a log-Sobolev inequality to hold; cf. [8, 9, 14, 27, 31]. Adding to these efforts, the second author of the present paper developed a new approximation scheme, the mollified log-Sobolev inequality, in [47]: if $Y$ is any bounded random variable and $Z$ is a standard normal random variable independent from $Y$, the law of $Y + t^{1/2}Z$ satisfies a log-Sobolev inequality for all $t > 0$, with a constant $c(t)$ that is bounded in terms of an exponential of $\|Y - \mathbb{E}[Y]\|_\infty^2/t$

  • A dimension-independent bound of this form would not improve our result in Theorem 1.1. It is the exponential dependence of the constant on $\|Y - \mathbb{E}[Y]\|_{L^\infty}$ that forces the blocks to be of size o(log N); and this dependence is sharp, as was shown in the second author’s paper [47, Theorem 15]
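For reference, the log-Sobolev inequality discussed in these highlights has the following standard form; this statement is classical (normalization conventions for the constant vary) and is not quoted from the paper:

```latex
% A probability measure \mu on \mathbb{R}^d satisfies a log-Sobolev inequality
% with constant c > 0 if, for every sufficiently smooth f normalized so that
% \int f^2 \, d\mu = 1,
\operatorname{Ent}_\mu(f^2) \;=\; \int f^2 \log f^2 \, d\mu
  \;\le\; 2c \int |\nabla f|^2 \, d\mu .
```

The mollified version asserts that even when the law of $Y$ itself satisfies no such inequality, the Gaussian-smoothed law of $Y + t^{1/2}Z$ does, at the cost of a constant $c(t)$ that degrades as $t \to 0$.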


Summary

Introduction

Random matrix theory is primarily interested in the convergence of statistics associated to the eigenvalues (or singular values) of N × N matrices whose entries are random variables with a prescribed joint distribution. We use techniques similar to those in the proof of Theorem 1.1 to prove the following stronger result in the case of Gaussian entries: under the appropriate uniform integrability conditions, the convergence of the ESD is almost sure, and is guaranteed for blocks of much larger size. If the $\xi_{ij}$ are bounded random variables, or if the common law of the entries $\xi_{ij}$ satisfies a log-Sobolev inequality (cf. (1.4) below), the convergence is almost sure. A dimension-independent bound of this form would not improve our result in Theorem 1.1. It is the exponential dependence of the constant on $\|Y - \mathbb{E}[Y]\|_{L^\infty}$ that forces the blocks to be of size o(log N); and this dependence is sharp, as was shown in the second author’s paper [47, Theorem 15].
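As a concrete illustration of the objects involved (a sketch, not code from the paper), the following samples a symmetric matrix with independent Gaussian entries under the $\sqrt{N}$ scaling and computes its empirical spectral distribution (ESD), whose bulk approaches the semicircle law on [−2, 2] as N grows:

```python
# Sketch: the empirical spectral distribution of a Wigner matrix with
# fully independent Gaussian entries (the classical, uncorrelated case).
import numpy as np

def wigner_esd(N, seed=0):
    """Return the eigenvalues of a symmetric N x N Gaussian Wigner matrix,
    scaled so that off-diagonal entries have variance 1/N; the ESD then
    approaches the semicircle law supported on [-2, 2]."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((N, N))
    X = (A + A.T) / np.sqrt(2 * N)  # symmetric; off-diagonal entries ~ N(0, 1/N)
    return np.linalg.eigvalsh(X)    # sorted real eigenvalues

eigs = wigner_esd(1000)
print(eigs.min(), eigs.max())       # bulk concentrates on [-2, 2] for large N
```

The results of the paper concern the much harder regime where the entries are not all independent but only partitioned into independent subsets of size o(log N).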

Concentration Results for Ensembles with Correlated Entries
Guionnet’s Approach to Wigner’s Law
Mollified Log-Sobolev Inequalities on $\mathbb{R}^d$
