Abstract

Low-rank matrix regression refers to the problem of recovering a low-rank matrix from specially designed measurements and the corresponding noisy outcomes. Numerous statistical methods have been developed over the past decade for efficiently reconstructing unknown low-rank matrices. In certain applications, it is also of interest to estimate the unknown singular subspaces. In this paper, we revisit the low-rank matrix regression model and introduce a two-step procedure to construct confidence regions for the singular subspaces. We investigate the distribution of the joint projection distance between the empirical singular subspaces and the unknown true singular subspaces. We prove asymptotic normality of the joint projection distance, with data-dependent centering and normalization, when $r^{3/2}(m_{1}+m_{2})^{3/2}=o(n/\log n)$, where $m_{1}, m_{2}$ denote the matrix row and column sizes, $r$ is the rank, and $n$ is the number of independent random measurements. Consequently, data-dependent confidence regions for the true singular subspaces are established which attain pre-determined confidence levels asymptotically. Additionally, non-asymptotic convergence rates are established. Numerical results are presented to show the merits of our methods.
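The central quantity above, the joint projection distance, measures how far the leading singular subspaces of an estimate are from those of the truth. The following minimal sketch (assuming a simulated rank-$r$ matrix and a generic noisy estimate standing in for the output of a matrix-regression procedure; all sizes and noise levels are illustrative) computes it as the sum of squared Frobenius distances between the corresponding projection matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
m1, m2, r = 30, 20, 3  # illustrative row size, column size, and rank

# True rank-r matrix M = U diag(s) V^T with orthonormal singular subspaces
U, _ = np.linalg.qr(rng.standard_normal((m1, r)))
V, _ = np.linalg.qr(rng.standard_normal((m2, r)))
s = np.array([10.0, 8.0, 6.0])
M = U @ np.diag(s) @ V.T

# Noisy estimate: a stand-in for the estimator produced by the
# two-step matrix-regression procedure described in the paper
M_hat = M + 0.1 * rng.standard_normal((m1, m2))

# Empirical leading-r singular subspaces of the estimate
Uh_full, _, Vh_t = np.linalg.svd(M_hat)
Uh, Vh = Uh_full[:, :r], Vh_t[:r, :].T

# Joint squared projection distance:
# ||Uh Uh^T - U U^T||_F^2 + ||Vh Vh^T - V V^T||_F^2
dist2 = (np.linalg.norm(Uh @ Uh.T - U @ U.T, "fro") ** 2
         + np.linalg.norm(Vh @ Vh.T - V @ V.T, "fro") ** 2)
print(dist2)
```

The projection-matrix form is used because it is invariant to the choice of orthonormal basis within each subspace, so the distance depends only on the subspaces themselves.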
