We consider two matrix-valued data sets modeled as low-rank correlated signal plus Gaussian noise. When empirical canonical correlation analysis (CCA) is used to infer these latent correlations, there is a broad regime in which this inference fails, characterized by Bao and collaborators in the limit of high dimensionality and sample size. This regime includes the setting, previously considered by Pezeshki and collaborators, in which the sample size is less than the combined dimensionality of the data sets. We revisit this detection problem by first observing that the empirically estimated canonical correlation coefficients are the singular values of the matrix of inner products between the right singular vectors of the two data sets. Motivated by insights from random matrix theory, we propose an algorithm, which we call informative CCA (ICCA), that infers the presence of latent correlations from the singular values of only the informative right singular vectors of each data set. We establish fundamental detection limits for ICCA and show that it dramatically outperforms empirical CCA in broad regimes where empirical CCA provably fails. We extend our theoretical analysis to settings with randomly missing data and to more general noise models. Finally, we validate our theoretical results with numerical simulations and a real-world experiment.
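The key identity above can be checked numerically. The sketch below (NumPy, random Gaussian data, dimensions chosen arbitrarily) compares the classical whitened formulation of empirical CCA with the singular values of the inner products of right singular vectors, and illustrates the ICCA-style truncation; the cutoff rank `r` is a hypothetical choice here, not the paper's selection procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, n = 5, 4, 200                      # illustrative dimensions (n > p + q)
X = rng.standard_normal((p, n))          # data set 1 (variables x samples)
Y = rng.standard_normal((q, n))          # data set 2

def inv_sqrt(A):
    """Inverse symmetric square root of a positive-definite matrix."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

# Classical empirical CCA: singular values of (XX^T)^{-1/2} XY^T (YY^T)^{-1/2}.
M = inv_sqrt(X @ X.T) @ (X @ Y.T) @ inv_sqrt(Y @ Y.T)
ccs_classical = np.linalg.svd(M, compute_uv=False)

# Equivalent formulation: singular values of the matrix of inner products
# between the right singular vectors of the two data sets.
_, _, VxT = np.linalg.svd(X, full_matrices=False)  # rows of VxT: right sing. vecs of X
_, _, VyT = np.linalg.svd(Y, full_matrices=False)
ccs_svd = np.linalg.svd(VxT @ VyT.T, compute_uv=False)

# ICCA-style truncation: keep only the leading r "informative" right singular
# vectors of each data set (r = 2 is a hypothetical illustrative choice).
r = 2
ccs_icca = np.linalg.svd(VxT[:r] @ VyT[:r].T, compute_uv=False)
```

The equivalence holds because `(XX^T)^{-1/2} X = U_X V_X^T` for the SVD `X = U_X S_X V_X^T`, so the whitening factors reduce to orthogonal matrices that leave the singular values unchanged.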