Abstract

In this letter, we present a sharp algorithmic analysis of alternating projected gradient descent for solving the covariate-adjusted precision matrix estimation problem in high-dimensional settings. By introducing a new analytical tool (generic chaining), we remove the impractical resampling assumption used in the literature. The new analysis also shows that the algorithm not only enjoys a linear convergence rate despite the absence of convexity, but also attains the minimax rate with an optimal order of sample complexity. Our results, moreover, reveal a time-data tradeoff in this problem. Numerical experiments are provided to verify our theoretical results.
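The abstract names the algorithm but does not spell out the iteration, so the following is only a rough sketch of one common formulation of covariate-adjusted precision matrix estimation: a multivariate regression model Y = XΓ + E whose noise rows have a sparse precision matrix Ω, with alternating gradient steps on the negative Gaussian log-likelihood followed by hard-thresholding projections onto sparsity constraints. The model, loss, step size, and projection operators below are assumptions made for illustration, not the authors' exact algorithm or guarantees.

```python
import numpy as np

def hard_threshold(M, s):
    """Keep the s largest-magnitude entries of M and zero out the rest."""
    if s <= 0:
        return np.zeros_like(M)
    flat = np.abs(M).ravel()
    if s >= flat.size:
        return M.copy()
    cutoff = np.partition(flat, flat.size - s)[flat.size - s]
    return np.where(np.abs(M) >= cutoff, M, 0.0)

def project_precision(Omega, s):
    """Keep the diagonal plus the s largest-magnitude off-diagonal entries."""
    D = np.diag(np.diag(Omega))
    return D + hard_threshold(Omega - D, s)

def alternating_pgd(X, Y, s_gamma, s_omega, step=0.05, iters=300):
    """Illustrative alternating projected gradient descent (assumed setup).

    Assumed model: Y = X @ Gamma + E, rows of E i.i.d. N(0, inv(Omega)),
    with Gamma and Omega sparse.  Loss: the negative Gaussian log-likelihood
    tr(Omega S(Gamma)) - log det(Omega), where S(Gamma) is the residual
    covariance (Y - X Gamma)^T (Y - X Gamma) / n.
    """
    n, p = X.shape
    q = Y.shape[1]
    Gamma = np.zeros((p, q))
    Omega = np.eye(q)
    for _ in range(iters):
        # Gradient step in Gamma with Omega fixed, then sparse projection.
        R = Y - X @ Gamma
        grad_Gamma = -2.0 * (X.T @ R @ Omega) / n
        Gamma = hard_threshold(Gamma - step * grad_Gamma, s_gamma)
        # Gradient step in Omega with Gamma fixed, then sparse projection.
        R = Y - X @ Gamma
        S = (R.T @ R) / n
        grad_Omega = S - np.linalg.inv(Omega)
        Omega = project_precision(Omega - step * grad_Omega, s_omega)
        Omega = 0.5 * (Omega + Omega.T)  # enforce symmetry
        # A full implementation would presumably also project onto a
        # well-conditioned positive-definite set to keep the iterates valid.
    return Gamma, Omega

# Tiny synthetic run (hypothetical dimensions, for illustration only).
rng = np.random.default_rng(0)
n, p, q = 200, 10, 5
X = rng.standard_normal((n, p))
Gamma_true = hard_threshold(rng.standard_normal((p, q)), 8)
Y = X @ Gamma_true + rng.standard_normal((n, q))
Gamma_hat, Omega_hat = alternating_pgd(X, Y, s_gamma=8, s_omega=4)
```

The hard-thresholding projections keep the problem nonconvex, which is exactly the regime where the letter's claim of a linear convergence rate without convexity would apply; the number of iterations and the sample size n in such analyses typically trade off against each other, which is consistent with the time-data tradeoff mentioned in the abstract.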
