Abstract
We consider the following detection problem: given a realization of a symmetric $n\times n$ matrix $X$, distinguish between the hypothesis that all upper-triangular entries are independent and identically distributed (i.i.d.) Gaussian variables with mean 0 and variance 1, and the hypothesis that $X$ is the sum of such a matrix and an independent rank-one perturbation. This setup covers the situation where, under the alternative, there is a planted principal submatrix $B$ of size $L$ whose upper-triangular entries are i.i.d. Gaussians with mean 1 and variance 1, while all other upper-triangular entries of $X$ are i.i.d. Gaussians with mean 0 and variance 1. We refer to this as the "Gaussian hidden clique problem." When $L=(1+\epsilon)\sqrt{n}$ (for any fixed $\epsilon>0$), this detection problem can be solved with probability $1-o_{n}(1)$ by computing the spectrum of $X$ and thresholding its largest eigenvalue. We prove that this condition is tight in the following sense: when $L\le(1-\epsilon)\sqrt{n}$, no algorithm that examines only the eigenvalues of $X$ can detect the hidden Gaussian clique with error probability vanishing as $n\to\infty$. We prove this result as an immediate consequence of a more general result on rank-one perturbations of $k$-dimensional Gaussian tensors. In that context, we establish a lower bound on the critical signal-to-noise ratio below which a rank-one signal cannot be detected.
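The spectral test described above can be sketched in a short simulation. This is an illustrative sketch, not the paper's code: the function names (`sample_matrix`, `spectral_detect`), the GOE-style symmetrization, and the specific threshold (slightly above the bulk spectral edge $2\sqrt{n}$) are our own choices, made so that the planted case with $L=(1+\epsilon)\sqrt{n}$ produces an outlier eigenvalue at roughly $L+n/L > 2\sqrt{n}$.

```python
import numpy as np

def sample_matrix(n, L, planted, rng):
    """Sample a symmetric n x n Gaussian matrix (off-diagonal variance 1);
    if `planted`, shift the mean of the entries in a hidden L x L
    principal submatrix by +1."""
    A = rng.standard_normal((n, n))
    X = (A + A.T) / np.sqrt(2)  # GOE-style symmetrization
    if planted:
        idx = rng.choice(n, size=L, replace=False)
        X[np.ix_(idx, idx)] += 1.0  # mean-1 entries inside the clique
    return X

def spectral_detect(X, threshold):
    """Declare 'planted' when the largest eigenvalue exceeds the threshold."""
    return bool(np.linalg.eigvalsh(X)[-1] > threshold)

rng = np.random.default_rng(0)
n, eps = 1600, 0.5
L = int((1 + eps) * np.sqrt(n))  # above the sqrt(n) spectral threshold
thr = 2 * np.sqrt(n) + 1.0       # just above the bulk edge 2*sqrt(n)

null = spectral_detect(sample_matrix(n, L, False, rng), thr)
alt = spectral_detect(sample_matrix(n, L, True, rng), thr)
```

At this size the planted outlier sits near $L + n/L \approx 86.7$ while the null spectrum ends near $2\sqrt{n} = 80$, so `null` comes out `False` and `alt` comes out `True`; below the $\sqrt{n}$ threshold the outlier is absorbed into the bulk and the test fails, which is the regime the lower bound addresses.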