Abstract

Optimal upper bound for the infinity norm of eigenvectors of random matrices
by Ke Wang
Dissertation Director: Professor Van Vu

Let M_n be a random Hermitian (or symmetric) matrix whose entries on and above the diagonal are independent random variables with mean zero and variance one. It is well known that the empirical spectral distribution (ESD) converges in probability to the semicircle law supported on [−2, 2]. In this thesis we study the local convergence of the ESD to the semicircle law. One main result is that if the entries of M_n are bounded, then the semicircle law holds on intervals of scale log n/n. As a consequence, we obtain a delocalization result for the eigenvectors: the infinity norm of a unit eigenvector corresponding to an eigenvalue in the bulk of the spectrum is O(√(log n/n)). This bound matches the infinity norm of a vector chosen uniformly on the unit sphere in R^n. We also study the local version of the Marchenko-Pastur law for random covariance matrices and obtain the optimal upper bound for the infinity norm of the singular vectors. This is joint work with V. Vu. In the last chapter, we discuss delocalization properties for the adjacency matrices of Erdős-Rényi random graphs. This is part of earlier results joint with L. Tran and V. Vu.
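
As an illustration of the scales mentioned above, the following is a minimal numerical sketch, not taken from the dissertation: it samples a Gaussian Wigner-type matrix of hypothetical size n = 2000, checks that the normalized spectrum concentrates on [−2, 2], and compares the infinity norms of bulk unit eigenvectors with the √(log n/n) scale and with a uniformly random unit vector on the sphere in R^n. The Gaussian entries, the size n, and the definition of the bulk as eigenvalues in [−1, 1] are illustration choices, not assumptions of the thesis.

# Minimal sketch (assumptions: Gaussian entries, n = 2000, bulk = eigenvalues in [-1, 1]).
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Symmetric matrix with independent mean-zero, variance-one entries on and above
# the diagonal; normalizing by sqrt(n) puts the spectrum near [-2, 2].
A = rng.standard_normal((n, n))
M = np.triu(A) + np.triu(A, 1).T
eigvals, eigvecs = np.linalg.eigh(M / np.sqrt(n))

# Fraction of eigenvalues inside [-2, 2] (should be close to 1).
print("fraction of eigenvalues in [-2, 2]:", np.mean(np.abs(eigvals) <= 2))

# Largest infinity norm among bulk unit eigenvectors, compared with the
# sqrt(log n / n) scale from the delocalization bound.
bulk = np.abs(eigvals) <= 1.0
print("max infinity norm of bulk eigenvectors:", np.abs(eigvecs[:, bulk]).max())
print("sqrt(log n / n) reference scale:       ", np.sqrt(np.log(n) / n))

# Infinity norm of a uniformly random unit vector on the sphere in R^n, for comparison.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
print("infinity norm of a uniform unit vector:", np.abs(v).max())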

