Abstract

We study the convergence of the iterative projected gradient (IPG) algorithm for arbitrary (possibly non-convex) sets when both the gradient and projection oracles are computed approximately. We consider several notions of approximation and show that the progressive fixed-precision and the $(1+\varepsilon)$-optimal oracles can achieve the same accuracy as the exact IPG algorithm. We further show that the former scheme also maintains the (linear) rate of convergence of the exact algorithm under the same embedding assumption. In contrast, the $(1+\varepsilon)$-approximate oracle requires a stronger embedding condition and moderate compression ratios, and it typically slows down the convergence. We apply our results to accelerate solving a class of data-driven compressed sensing problems, where we replace the exhaustive searches over large data sets performed at every iteration by fast approximate nearest neighbor search strategies based on the cover tree data structure. For data sets with low intrinsic dimensions, our proposed algorithm achieves a complexity logarithmic in the data set population, as opposed to the linear complexity of a brute-force search. Several numerical experiments corroborate our theoretical analysis.
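To make the setting concrete, the following is a minimal sketch of an IPG iteration in which the projection oracle is a nearest-neighbor search over a finite data set, as in the data-driven compressed sensing application described above. All names (`ipg`, `nn_project`), the toy dimensions, and the brute-force search standing in for the cover-tree-based approximate search are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ipg(A, y, project, x0, step, n_iter=100):
    """Iterative projected gradient for min ||y - A x||^2 over a
    constraint set accessed only through the `project` oracle."""
    x = x0
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)      # gradient of the least-squares loss
        x = project(x - step * grad)  # (possibly approximate) projection
    return x

# Toy instance: the constraint set is a finite data set of signals, so
# projection reduces to a nearest-neighbor search over its rows.  A cover
# tree would replace this brute-force scan with a logarithmic-time
# approximate search; here we keep the exact linear scan for brevity.
rng = np.random.default_rng(0)
D = rng.standard_normal((50, 8))               # hypothetical data set

def nn_project(v):
    return D[np.argmin(np.linalg.norm(D - v, axis=1))]

x_true = D[7]                                  # signal to recover
A = rng.standard_normal((6, 8)) / np.sqrt(6)   # compressive measurements
y = A @ x_true
x_hat = ipg(A, y, nn_project, np.zeros(8), step=0.3, n_iter=50)
```

Since the last operation of each iteration is the projection, the iterate always lies in the data set; whether it converges to `x_true` depends on the embedding condition the measurement matrix satisfies.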
