Abstract

We introduce a novel framework for the approximate recovery of data matrices that are low rank on graphs from sampled measurements. The rows and columns of such matrices lie in the span of the first few eigenvectors of the graphs constructed between their rows and columns. We leverage this property to recover nonlinear low-rank structures efficiently from sampled data measurements at a low cost (linear in $n$). First, a restricted isometry property condition is introduced for efficient uniform sampling of the rows and columns of such matrices, based on the cumulative coherence of the graph eigenvectors. Second, a state-of-the-art fast low-rank recovery method is proposed for the sampled data. Finally, several efficient, parallel, and parameter-free decoders are presented, along with their theoretical analysis, for decoding the low-rank matrix and cluster indicators for the full data matrix. Thus, we overcome the computational limitations of standard linear low-rank recovery methods for big datasets. Our method can also be seen as a major step toward the efficient recovery of nonlinear low-rank structures. For a matrix of size $n \times p$, on a single-core machine, our method achieves a speedup of $p^2/k$ over robust principal component analysis (RPCA), where $k \ll p$ is the subspace dimension. Numerically, we can recover a low-rank matrix of size $10304 \times 1000$ 100 times faster than RPCA.
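The "low rank on graphs" property above can be illustrated with a minimal NumPy sketch (an assumption-laden toy, not the paper's method): a matrix whose rows and columns lie in the span of the first $k$ eigenvectors of row and column graph Laplacians is exactly recovered by projecting onto those spans. The path-graph Laplacians and the sizes used here are hypothetical choices for illustration.

```python
import numpy as np

def path_laplacian(n):
    # Combinatorial Laplacian L = D - A of a path graph on n nodes
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return np.diag(A.sum(axis=1)) - A

n, p, k = 60, 40, 5
Lr, Lc = path_laplacian(n), path_laplacian(p)
_, Ur = np.linalg.eigh(Lr)   # eigenvectors, ascending eigenvalue order
_, Uc = np.linalg.eigh(Lc)
Pr, Pc = Ur[:, :k], Uc[:, :k]  # first k eigenvectors of each graph

# A matrix that is "low rank on graphs": its columns lie in span(Pr)
# and its rows lie in span(Pc)
rng = np.random.default_rng(0)
X = Pr @ rng.standard_normal((k, k)) @ Pc.T

# Projecting onto both eigenvector spans recovers X (up to float error),
# so only O(k^2) coefficients are needed to represent it
X_hat = Pr @ (Pr.T @ X @ Pc) @ Pc.T
print(np.allclose(X, X_hat))
```

Because the projection onto $k$ eigenvectors per side captures such a matrix exactly, sampling a subset of its rows and columns (under the coherence condition mentioned above) suffices for recovery, which is the source of the claimed speedup over full-matrix methods like RPCA.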
