Abstract

Several spectral fluctuation measures of random matrix theory (RMT) have been applied in the study of spectral properties of networks. However, the calculation of those statistics requires performing an unfolding procedure, which may not be an easy task. In this work, network spectra are interpreted as time series, and we show how their short- and long-range correlations can be characterized without any prior unfolding. In particular, we consider three different representations of Erdős–Rényi (ER) random networks: standard ER networks, ER networks with random-weighted self-edges, and fully random-weighted ER networks. In each case, we apply singular value decomposition (SVD) such that the spectra are decomposed into trend and fluctuation normal modes. We find that the fluctuation modes exhibit a clear crossover between the Poisson and the Gaussian orthogonal ensemble statistics as the average degree of the ER networks changes. Moreover, by using the trend modes, we perform a data-adaptive unfolding to calculate, for comparison purposes, traditional fluctuation measures such as the nearest-neighbor spacing distribution, the number variance Σ2, and the Δ3 and δn statistics. The thorough comparison of RMT short- and long-range correlation measures leads us to identify the SVD method as a robust tool for characterizing random network spectra.
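The core idea, treating each ordered spectrum as a time series and splitting an ensemble of such spectra into trend and fluctuation modes via SVD, can be sketched as follows. This is a minimal illustration, not the paper's exact pipeline: the matrix sizes, the ensemble (here, plain symmetric Gaussian random matrices standing in for network adjacency matrices), and the cutoff `k` separating trend from fluctuation modes are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes for illustration: M realizations, N eigenvalues each.
M, N = 50, 40
spectra = np.empty((M, N))
for i in range(M):
    A = rng.standard_normal((N, N))
    H = (A + A.T) / 2                      # symmetric random matrix as a stand-in
    spectra[i] = np.sort(np.linalg.eigvalsh(H))

# Each row is a "time series" of ordered eigenvalues; SVD decomposes the
# ensemble into orthogonal normal modes ranked by singular value.
U, s, Vt = np.linalg.svd(spectra, full_matrices=False)

# Heuristic split (assumed cutoff): the first k dominant modes carry the
# smooth trend; the remainder carry the fluctuations, with no unfolding step.
k = 2
trend = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]
fluct = spectra - trend

print(trend.shape, fluct.shape)
```

In practice the cutoff between trend and fluctuation modes would be chosen by inspecting the singular-value spectrum (dominant modes separate sharply from the rest), and the fluctuation part is what the Poisson-to-GOE crossover analysis is applied to.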
