Abstract
Very sparse random graphs are known to typically be singular (i.e., to have singular adjacency matrix) due to the presence of “low-degree dependencies” such as isolated vertices and pairs of degree-1 vertices with the same neighborhood. We prove that these kinds of dependencies are in some sense the only causes of singularity: for constants k≥3 and λ>0, an Erdős–Rényi random graph G∼G(n,λ∕n) with n vertices and edge probability λ∕n typically has the property that its k-core (its largest subgraph with minimum degree at least k) is nonsingular. This resolves a conjecture of Vu from the 2014 International Congress of Mathematicians, and adds to a short list of known nonsingularity theorems for “extremely sparse” random matrices with density O(1∕n). A key aspect of our proof is a technique to extract high-degree vertices and use them to “boost” the rank, starting from approximate rank bounds obtainable from (nonquantitative) spectral convergence machinery due to Bordenave, Lelarge, and Salez.
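To illustrate the kind of low-degree dependency described above, the following minimal sketch (using NumPy, on a small graph chosen for illustration and not taken from the paper) shows why a pair of degree-1 vertices sharing their unique neighbor forces singularity: the two corresponding rows of the adjacency matrix are identical, so the determinant vanishes.

```python
import numpy as np

# Small graph on 4 vertices: vertices 0 and 1 each have degree 1
# and share the single neighbor 2; vertex 3 is also attached to 2.
# Rows 0 and 1 of the adjacency matrix are both e_2 (identical),
# so the matrix is singular.
A = np.zeros((4, 4), dtype=int)
edges = [(0, 2), (1, 2), (2, 3)]
for i, j in edges:
    A[i, j] = A[j, i] = 1

print(np.linalg.matrix_rank(A))   # strictly less than 4
print(round(np.linalg.det(A)))    # 0
```

An isolated vertex causes singularity in the same way, via an all-zero row; the theorem says that after deleting everything outside the k-core (for k≥3), such obstructions typically no longer exist.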