Abstract

Supported by recent contributions in multiple domains, first-order splitting methods have become the algorithms of choice for structured nonsmooth optimization. In large-scale noisy settings, only stochastic information on the objective function is available, and the extension of proximal gradient schemes to stochastic oracles relies heavily on the tractability of the proximal operator of the nonsmooth component, an assumption that has been widely exploited in the literature. However, questions remain about the complexity of composite models whose nonsmooth terms have intractable proximal operators. In this paper we tackle composite optimization problems, assuming access only to stochastic information on both the smooth and the nonsmooth components, using a stochastic proximal first-order scheme with stochastic proximal updates. We establish sublinear \(\mathcal{O}\left(\frac{1}{k}\right)\) convergence rates (in expectation of the squared distance to the optimal set) under a strong convexity assumption on the objective function. Moreover, linear convergence is achieved for convex feasibility problems. The empirical behavior is illustrated by numerical tests on parametric sparse representation models.
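To make the abstract's setting concrete, the following Python sketch illustrates the general template of a stochastic proximal first-order iteration with stochastic proximal updates: a gradient step on a sampled smooth term followed by a proximal step on a sampled nonsmooth term. The problem instance (sparse least squares), sampling rule, and step-size choice are illustrative assumptions only, not the paper's exact algorithm.

```python
import numpy as np

# Hedged sketch: stochastic proximal gradient with stochastic proximal updates
# for F(x) = E_i[f_i(x)] + E_j[h_j(x)], where f_i(x) = 0.5*(a_i @ x - b_i)**2
# is smooth and h_j(x) = lam*|x_j| is nonsmooth with a cheap per-term prox.
# All data below is synthetic and chosen purely for illustration.

rng = np.random.default_rng(0)
n, d, lam = 200, 20, 0.1
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(n)

x = np.zeros(d)
for k in range(1, 5001):
    step = 1.0 / k                        # diminishing step size (O(1/k) regime)
    i = rng.integers(n)                   # stochastic gradient on a sampled smooth term
    x = x - step * (A[i] @ x - b[i]) * A[i]
    j = rng.integers(d)                   # stochastic proximal step on a sampled
    x[j] = np.sign(x[j]) * max(abs(x[j]) - step * lam, 0.0)  # nonsmooth term: soft-threshold

print("estimate:", np.round(x, 3))
```

The key point the sketch is meant to convey is that neither the full gradient nor the proximal operator of the full nonsmooth sum is ever evaluated; each iteration touches only one sampled smooth term and one sampled nonsmooth term.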
