Abstract

Suppose $\{X_k\}_{k \in \mathbb{Z}}$ is a sequence of bounded independent random matrices with common dimension $d \times d$ and common expectation $\mathbb{E}[X_k] = X$. Under these general assumptions, the normalized random matrix product
$$Z_n = \left(I_d + \tfrac{1}{n}X_n\right)\left(I_d + \tfrac{1}{n}X_{n-1}\right)\cdots\left(I_d + \tfrac{1}{n}X_1\right)$$
converges to $e^X$ as $n \to \infty$. Normalized random matrix products of this form arise naturally in stochastic iterative algorithms, such as Oja's algorithm for streaming Principal Component Analysis. Here, we derive nonasymptotic concentration inequalities for such random matrix products. In particular, we show that the spectral norm error satisfies $\|Z_n - e^X\| = O\big((\log n)^2 \sqrt{\log(d/\delta)/n}\big)$ with probability exceeding $1 - \delta$. This rate is sharp in $n$, $d$, and $\delta$, up to logarithmic factors. The proof relies on two key results: the matrix Bernstein inequality, concerning the concentration of sums of random matrices, and Baranyai's theorem from combinatorial mathematics. Concentration bounds for general classes of random matrix products are hard to come by in the literature, and we hope that our result will inspire further work in this direction.
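As an illustrative sketch (not part of the paper), the convergence of $Z_n$ to $e^X$ can be checked empirically: draw bounded i.i.d. perturbations around a fixed mean matrix $X$, form the normalized product, and compare it to the matrix exponential. The dimension, noise distribution, and sample sizes below are arbitrary choices for the demonstration.

```python
import numpy as np
from scipy.linalg import expm

# Empirical check that Z_n = (I + X_n/n)...(I + X_1/n) concentrates
# around e^X when the X_k are bounded, independent, with mean X.
rng = np.random.default_rng(0)
d = 3
X = rng.uniform(-1.0, 1.0, size=(d, d))  # common expectation E[X_k] = X

def Z(n):
    """Accumulate the product of the factors (I_d + X_k / n), k = 1..n."""
    Zn = np.eye(d)
    for _ in range(n):
        # bounded, mean-zero entrywise noise around X (an arbitrary choice)
        Xk = X + rng.uniform(-0.5, 0.5, size=(d, d))
        Zn = (np.eye(d) + Xk / n) @ Zn  # new factor multiplies on the left
    return Zn

errs = []
for n in [100, 10_000]:
    err = np.linalg.norm(Z(n) - expm(X), ord=2)  # spectral norm error
    errs.append(err)
    print(f"n = {n:6d}: ||Z_n - e^X|| = {err:.4f}")
```

Consistent with the stated $\tilde{O}(1/\sqrt{n})$ rate, the spectral norm error should shrink by roughly an order of magnitude as $n$ grows from $10^2$ to $10^4$.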
