Abstract

In this work we investigate the practicality of stochastic gradient descent and recently introduced variance-reduced variants for imaging inverse problems. Such algorithms have been shown in the machine learning literature to achieve optimal complexities in theory and to deliver large empirical improvements over deterministic gradient methods. Surprisingly, in some tasks such as image deblurring, many of these methods fail to converge faster than accelerated deterministic gradient methods, even in terms of epoch counts. We investigate this phenomenon and propose a theory-inspired mechanism that lets practitioners efficiently characterize whether an inverse problem benefits from stochastic optimization techniques. Using standard tools from numerical linear algebra, we derive conditions on the spectral structure of the inverse problem under which it is a suitable application for stochastic gradient methods. In particular, we show that an imaging inverse problem is solved more efficiently by stochastic gradient methods than by deterministic methods if and only if its Hessian matrix has a fast-decaying eigenspectrum. Our results also provide guidance on choosing partition minibatch schemes, showing that a good minibatch scheme typically has relatively low correlation within each minibatch. Finally, we propose an accelerated primal-dual SGD algorithm to tackle another key bottleneck of stochastic optimization: the heavy computation of proximal operators. The proposed method converges quickly in practice and efficiently handles non-smooth regularization terms that are coupled with linear operators.
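To make the spectral criterion above concrete, the following minimal sketch (our illustration, not the paper's procedure) compares the eigenspectrum decay of the Hessian A^T A for a deblurring-like convolution operator and a random Gaussian measurement matrix; all sizes and the 90%-mass summary statistic are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: compare the eigenspectrum decay of the Hessian A^T A
# for two forward operators. A fast-decaying spectrum is the regime where,
# per the abstract, stochastic gradient methods are expected to win.

rng = np.random.default_rng(0)
n = 256  # signal dimension (toy size)

# Deblurring-like operator: circulant matrix of a 1D Gaussian blur kernel.
t = np.arange(n)
kernel = np.exp(-0.5 * ((t - n // 2) / 4.0) ** 2)
kernel /= kernel.sum()
A_blur = np.stack([np.roll(kernel, s - n // 2) for s in range(n)])

# Compressed-sensing-like operator: i.i.d. Gaussian measurements.
A_rand = rng.standard_normal((n, n)) / np.sqrt(n)

for name, A in [("deblurring", A_blur), ("random Gaussian", A_rand)]:
    eigs = np.linalg.eigvalsh(A.T @ A)[::-1]  # Hessian eigenvalues, descending
    eigs /= eigs[0]                           # normalize by the largest one
    k90 = int(np.searchsorted(np.cumsum(eigs), 0.9 * eigs.sum())) + 1
    print(f"{name}: 90% of spectral mass in top {k90}/{n} eigenvalues")
```

The blur operator concentrates its spectral mass in a handful of eigenvalues (fast decay), while the random Gaussian operator spreads it broadly, illustrating the two regimes the abstract distinguishes.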

Highlights

  • Stochastic gradient-based optimization algorithms have become ubiquitous in real-world applications that involve solving large-scale, high-dimensional optimization tasks in the field of machine learning [11], due to their scalability with the size of the optimization problem

  • We provide a novel analysis of the estimation-error convergence rate of minibatch proximal SGD for solving linear inverse problems with regularization constraints, under the expected smoothness [26] and restricted strong-convexity [3, 49] conditions

  • Based on our theoretical analysis, we propose to evaluate the limit of possible acceleration of a stochastic gradient method over its full-gradient counterpart by measuring Stochastic Acceleration (SA) factors, which are based on the ratio of the Lipschitz constants of the minibatched stochastic gradient and the full gradient (a sketch of this computation follows the list)
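
As a concrete illustration of this last highlight, the sketch below computes one plausible version of such a factor for a least-squares data-fidelity term: the ratio of the worst-case minibatch-gradient Lipschitz constant to the full-gradient Lipschitz constant under a disjoint data partition. The function name sa_factor and the exact normalizations are our assumptions; the paper's precise definition may differ.

```python
import numpy as np

def sa_factor(A, batch_size):
    """Hypothetical SA-style factor: worst-case minibatch-gradient
    Lipschitz constant divided by the full-gradient Lipschitz constant,
    for a least-squares data term with disjoint partition minibatches."""
    m = A.shape[0]
    # Lipschitz constant of the full gradient (1/m) * A^T (Ax - y).
    L_full = np.linalg.norm(A, 2) ** 2 / m
    # Worst-case Lipschitz constant over the partition minibatches.
    blocks = [A[i:i + batch_size] for i in range(0, m, batch_size)]
    L_batch = max(np.linalg.norm(B, 2) ** 2 / B.shape[0] for B in blocks)
    return L_batch / L_full

rng = np.random.default_rng(0)
A = rng.standard_normal((512, 128))
print(sa_factor(A, batch_size=32))
```

Intuitively, a ratio near 1 means each minibatch step can use a step size comparable to a full-gradient step, leaving the most room for per-epoch speedup; a large ratio forces small stochastic steps and erodes the advantage.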


Introduction

Stochastic gradient-based optimization algorithms have become ubiquitous in real-world applications that involve solving large-scale, high-dimensional optimization tasks in the field of machine learning [11], due to their scalability with the size of the optimization problem. We have provided a preliminary motivational analysis, which demonstrates that the worst-case speedup of stochastic gradient methods (with data-partition minibatches) over their deterministic counterparts, in terms of objective-gap convergence, is controlled by the ratio of the Lipschitz constants of the stochastic gradient and the full gradient, for the case of unregularized smooth optimization. Such an analysis, while motivational, is restrictive in some respects: in imaging inverse problems we usually consider non-smooth regularization, and we are more concerned with the convergence rates of optimization algorithms in terms of estimation error. There has been recent progress [25, 24] in identifying near-optimal step-size choices for minibatch stochastic gradient descent and SAGA algorithms for minimizing strongly convex and smooth objective functions. These existing results cannot be directly applied to inverse problems.
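
To fix ideas, here is a minimal sketch (our illustration, not the paper's algorithm) of minibatch proximal SGD for a regularized linear inverse problem min_x (1/(2m))||Ax − y||² + λ||x||₁, where each step takes the gradient of one data-partition minibatch followed by the proximal operator of the ℓ1 regularizer (soft-thresholding); the step size, sizes, and regularization weight are illustrative assumptions.

```python
import numpy as np

def prox_l1(x, thresh):
    """Proximal operator of thresh * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - thresh, 0.0)

def minibatch_prox_sgd(A, y, lam, batch_size, step, epochs, rng):
    """Illustrative minibatch proximal SGD for
       min_x (1/(2m)) * ||Ax - y||^2 + lam * ||x||_1,
    with disjoint data-partition minibatches and a fixed step size."""
    m, n = A.shape
    x = np.zeros(n)
    batches = [slice(i, min(i + batch_size, m)) for i in range(0, m, batch_size)]
    for _ in range(epochs):
        for k in rng.permutation(len(batches)):
            S = batches[k]
            b = S.stop - S.start
            # Unbiased stochastic gradient of the data-fidelity term.
            grad = A[S].T @ (A[S] @ x - y[S]) / b
            # Proximal (soft-thresholding) step handles the non-smooth term.
            x = prox_l1(x - step * grad, step * lam)
    return x

# Toy demo on a sparse recovery problem (all sizes are illustrative).
rng = np.random.default_rng(0)
A = rng.standard_normal((400, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
y = A @ x_true + 0.01 * rng.standard_normal(400)
x_hat = minibatch_prox_sgd(A, y, lam=0.1, batch_size=40, step=0.01, epochs=50, rng=rng)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```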
