Abstract

This paper is devoted to a new modification of a recently proposed adaptive stochastic mirror descent algorithm for constrained convex optimization problems with several convex functional constraints. Both the standard algorithm and the proposed modification are considered for problems with a non-smooth Lipschitz-continuous convex objective function and convex functional constraints. Both algorithms are optimal in terms of lower complexity bounds: to reach an accuracy \(\varepsilon \) of the approximate solution, they have complexity \(O\left( \varepsilon ^{-2} \right) \). In both algorithms, exact first-order information, namely the (sub)gradients of the objective function and of the functional constraints, is replaced by unbiased stochastic estimates. This means that at each iteration we can still use the values of the objective function and the functional constraints at the current point, but instead of their (sub)gradients we compute stochastic (sub)gradients. Because not all functional constraints are evaluated on non-productive steps, the proposed modification reduces the running time of the algorithm. Estimates for the rate of convergence of the proposed modified algorithm are obtained. The results of numerical experiments demonstrating the advantages and efficiency of the proposed modification on some examples are also given.
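The scheme summarized above, alternating productive steps (a stochastic subgradient step on the objective when the constraints are approximately satisfied) and non-productive steps (a step on a violated constraint), can be sketched roughly as follows. This is an illustrative sketch with the Euclidean prox-function, a fixed step size, and invented function names; it is not the paper's exact method or step-size policy, and the "modification" is reflected only in that constraints after the first violated one are not evaluated on a non-productive step.

```python
import numpy as np

def stochastic_mirror_descent(stoch_subgrad_f, constraints, stoch_subgrads_g,
                              x0, eps, n_iter, step, rng):
    """Hypothetical sketch: stochastic mirror descent with functional
    constraints (Euclidean prox). Exact constraint VALUES are assumed
    available; only (sub)gradients are replaced by unbiased stochastic
    estimates, as in the abstract."""
    x = np.asarray(x0, dtype=float)
    productive = []  # points generated on productive steps
    for _ in range(n_iter):
        violated = None
        for i, g in enumerate(constraints):
            if g(x) > eps:
                violated = i  # first violated constraint found
                break         # modification: skip the remaining constraints
        if violated is None:
            # productive step: stochastic subgradient of the objective
            productive.append(x.copy())
            d = stoch_subgrad_f(x, rng)
        else:
            # non-productive step: stochastic subgradient of constraint i
            d = stoch_subgrads_g[violated](x, rng)
        x = x - step * d  # mirror step with the Euclidean prox-function
    # return the average of the productive points
    return np.mean(productive, axis=0) if productive else x

# Toy usage: minimize f(x) = |x1| + |x2| subject to g(x) = 1 - x1 - x2 <= 0,
# with unbiased Gaussian noise added to the exact subgradients.
rng = np.random.default_rng(0)
f = lambda x: np.abs(x).sum()
g = lambda x: 1.0 - x.sum()
sf = lambda x, rng: np.sign(x) + rng.normal(0.0, 0.1, size=x.shape)
sg = [lambda x, rng: np.array([-1.0, -1.0]) + rng.normal(0.0, 0.1, size=2)]
x_hat = stochastic_mirror_descent(sf, [g], sg, [2.0, 2.0], 0.05, 5000, 0.01, rng)
```

Here the optimal value is \(f^{*}=1\), attained on the segment \(x_1 + x_2 = 1\), \(x_1, x_2 \ge 0\), and the returned average of productive points lands near that facet.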
