Abstract

We consider convex optimization problems with a possibly nonsmooth objective function in the form of a mathematical expectation. The proposed framework (AN-SPS) employs Sample Average Approximations (SAA) to approximate the objective function, which is either unavailable or too costly to compute. The sample size is chosen in an adaptive manner, which eventually pushes the SAA error to zero almost surely (a.s.). The search direction is based on a scaled subgradient and a spectral coefficient, both related to the SAA function. The step size is obtained via a nonmonotone line search over a predefined interval, which yields a theoretically sound and practically efficient algorithm. The method retains feasibility by projecting the resulting points onto the feasible set. The a.s. convergence of the AN-SPS method is proved without assuming a bounded feasible set or bounded iterates. Preliminary numerical results on hinge loss problems reveal the advantages of the proposed adaptive scheme. In addition, a study of different nonmonotone line search strategies in combination with different spectral coefficients within the AN-SPS framework is conducted, yielding some hints for future work.
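
The abstract names four ingredients: an adaptively growing SAA sample, a spectral coefficient scaling a normalized subgradient, a nonmonotone line search over a predefined interval, and a projection that keeps the iterates feasible. The sketch below shows, under stated assumptions, how these ingredients could fit together for a hinge-loss SAA objective; the function names (`an_sps_sketch`, `hinge_saa`, `project_ball`), the ball-shaped feasible set, the geometric sample-growth rule, and all constants are illustrative assumptions, not the authors' exact AN-SPS specification.

```python
import numpy as np

def hinge_saa(w, X, y, idx):
    """SAA value of the hinge loss over the sample idx, with one subgradient."""
    Xs, ys = X[idx], y[idx]
    margins = 1.0 - ys * (Xs @ w)
    f = np.maximum(margins, 0.0).mean()
    active = margins > 0.0
    g = -(Xs[active].T @ ys[active]) / len(idx)
    return f, g

def project_ball(w, radius=10.0):
    """Euclidean projection onto the (assumed) feasible set {w : ||w|| <= radius}."""
    nrm = np.linalg.norm(w)
    return w if nrm <= radius else (radius / nrm) * w

def an_sps_sketch(X, y, iters=200, N0=50, growth=1.1, M=5,
                  lam_min=1e-4, lam_max=1e4, t_min=1e-4, t_max=1.0,
                  eta=1e-4, beta=0.5, seed=0):
    """Schematic loop: adaptive sample size, spectral scaling of a normalized
    subgradient, nonmonotone backtracking on [t_min, t_max], and projection."""
    rng = np.random.default_rng(seed)
    n, dim = X.shape
    w = np.zeros(dim)
    N = N0
    lam = 1.0                      # spectral (BB-like) coefficient
    f_hist = []                    # recent SAA values for the nonmonotone test
    w_old = g_old = None
    for _ in range(iters):
        # Adaptive sample size: grow the SAA sample so its error vanishes.
        N = min(n, int(np.ceil(growth * N)))
        idx = rng.choice(n, size=N, replace=False)
        f, g = hinge_saa(w, X, y, idx)
        # Spectral coefficient from the last displacement / subgradient change.
        if w_old is not None:
            s, z = w - w_old, g - g_old
            if abs(s @ z) > 1e-12:
                lam = np.clip((s @ s) / (s @ z), lam_min, lam_max)
        d_k = -lam * g / max(np.linalg.norm(g), 1e-12)   # scaled subgradient
        # Nonmonotone line search: compare against the max of the last M values.
        f_hist.append(f)
        f_ref = max(f_hist[-M:])
        t = t_max
        w_new = project_ball(w + t * d_k)
        f_new, _ = hinge_saa(w_new, X, y, idx)
        while f_new > f_ref + eta * (g @ (w_new - w)) and t > t_min:
            t *= beta
            w_new = project_ball(w + t * d_k)
            f_new, _ = hinge_saa(w_new, X, y, idx)
        w_old, g_old = w, g
        w = w_new                  # feasibility is kept by the projection
    return w
```

The sketch can be run directly on synthetic data, e.g. `X = rng.standard_normal((1000, 20))` with labels in {-1, +1}; in the actual method, the sample-growth rule, the spectral safeguard, and the line-search interval would follow the conditions stated in the paper.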
