Abstract

Blocky optimization has gained significant attention in a wide range of practical applications. Following the recent work of M. Nikolova and P. Tan [SIAM J. Optim. 29 (2019), pp. 2053–2078] on solving a class of nonconvex nonsmooth optimization problems, we develop a stochastic alternating structure-adapted proximal (s-ASAP) gradient descent method for solving blocky optimization problems. By deploying state-of-the-art variance-reduced gradient estimators (rather than the full gradient) from stochastic optimization, the s-ASAP method is applicable to nonconvex problems whose objective is the sum of a nonsmooth data-fitting term and a finite number of differentiable functions. The sublinear convergence rate of s-ASAP is established within the proximal point algorithmic framework, while a linear convergence rate is achieved under an error bound condition. Furthermore, convergence of the sequence produced by s-ASAP is established under the Kurdyka-Łojasiewicz property. Preliminary numerical simulations on image processing applications demonstrate the compelling performance of the proposed method.
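To make the flavor of the method concrete, the following is a minimal, hypothetical sketch of one variance-reduced (SVRG-style) alternating proximal gradient scheme on a toy two-block problem, min over (x, y) of lam*||x||_1 + (1/n) * sum_i 0.5*(a_i^T x + b_i^T y - c_i)^2. All names, the problem instance, and the estimator choice are illustrative assumptions; this is not the authors' s-ASAP implementation.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def svrg_alternating_prox(A, B, c, lam, step, n_epochs=20, seed=0):
    """Toy SVRG-style alternating proximal gradient sketch for
        min_{x,y}  lam*||x||_1 + (1/n) * sum_i 0.5*(a_i^T x + b_i^T y - c_i)^2.
    The x-block takes a variance-reduced prox-gradient step and the y-block a
    variance-reduced gradient step; only an illustration of the idea."""
    rng = np.random.default_rng(seed)
    n, p = A.shape
    q = B.shape[1]
    x, y = np.zeros(p), np.zeros(q)
    for _ in range(n_epochs):
        # Full-gradient snapshot at the reference point (SVRG anchor).
        res_ref = A @ x + B @ y - c
        gx_ref, gy_ref = A.T @ res_ref / n, B.T @ res_ref / n
        x_ref, y_ref = x.copy(), y.copy()
        for _ in range(n):
            i = rng.integers(n)
            r_ref = A[i] @ x_ref + B[i] @ y_ref - c[i]
            # x-block: variance-reduced stochastic prox-gradient step.
            r_i = A[i] @ x + B[i] @ y - c[i]
            gx = A[i] * (r_i - r_ref) + gx_ref
            x = soft_threshold(x - step * gx, step * lam)
            # y-block: variance-reduced gradient step at the updated x.
            r_i = A[i] @ x + B[i] @ y - c[i]
            gy = B[i] * (r_i - r_ref) + gy_ref
            y = y - step * gy
    return x, y
```

Other variance-reduced estimators mentioned in the stochastic optimization literature (e.g. SAGA- or SARAH-type corrections) could be swapped in for the SVRG-style snapshot above; the alternating prox/gradient structure of the sketch would remain the same.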
