Abstract

In this paper, we show that the largest and smallest eigenvalues of a sample correlation matrix stemming from n independent observations of a p-dimensional time series with iid components converge almost surely to (1+√γ)² and (1−√γ)², respectively, as n→∞, if p/n→γ∈(0,1] and the truncated variance of the entry distribution is "almost slowly varying", a condition we describe via moment properties of self-normalized sums. Moreover, the empirical spectral distributions of these sample correlation matrices converge weakly, with probability 1, to the Marčenko–Pastur law, which extends a result in Bai and Zhou (2008). We compare the behavior of the eigenvalues of the sample covariance and sample correlation matrices and argue that the latter seems more robust, in particular in the case of infinite fourth moment. We briefly address some practical issues for the estimation of extreme eigenvalues in a simulation study. In our proofs we use the method of moments combined with a Path-Shortening Algorithm, which efficiently uses the structure of sample correlation matrices, to calculate precise bounds for matrix norms. We believe that this new approach could be of further use in random matrix theory.
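
The following minimal Python sketch is not part of the paper; it assumes standard normal entries (one convenient choice of iid components with all moments finite) and simply illustrates the limits stated above: the extreme eigenvalues of the sample correlation matrix approach (1−√γ)² and (1+√γ)², and the empirical spectral distribution is close to the Marčenko–Pastur density on its support.

```python
import numpy as np

# Illustrative simulation sketch (assumed setup, not the paper's code):
# n iid observations of a p-dimensional vector with iid N(0,1) components.
rng = np.random.default_rng(0)
n, p = 2000, 1000                 # gamma = p/n = 0.5
gamma = p / n

X = rng.standard_normal((n, p))   # rows = observations, columns = components
R = np.corrcoef(X, rowvar=False)  # p x p sample correlation matrix

eig = np.linalg.eigvalsh(R)       # eigenvalues in ascending order
lam_minus = (1 - np.sqrt(gamma)) ** 2
lam_plus = (1 + np.sqrt(gamma)) ** 2
print("smallest eigenvalue:", eig[0], " limit (1-sqrt(gamma))^2:", lam_minus)
print("largest  eigenvalue:", eig[-1], " limit (1+sqrt(gamma))^2:", lam_plus)

# Compare the empirical spectral distribution with the Marcenko-Pastur
# density f(x) = sqrt((lam_plus - x)(x - lam_minus)) / (2*pi*gamma*x)
# on (lam_minus, lam_plus), valid for gamma in (0, 1].
hist, edges = np.histogram(eig, bins=40, range=(lam_minus, lam_plus), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mp_density = np.sqrt((lam_plus - centers) * (centers - lam_minus)) / (2 * np.pi * gamma * centers)
print("max |ESD histogram - MP density| over bins:", np.abs(hist - mp_density).max())
```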
