Abstract

Two-stage stochastic optimization, in which a long-term master problem is coupled with a family of short-term subproblems, plays a critical role in various application areas. However, most existing algorithms for two-stage stochastic optimization only work for special cases and/or are based on the batch method, which incurs huge memory and computational costs. To the best of our knowledge, efficient and general two-stage online stochastic optimization algorithms are still lacking. This paper proposes a two-stage online successive convex approximation (TOSCA) algorithm for general two-stage nonconvex stochastic optimization problems. At each iteration, the TOSCA algorithm first solves one short-term subproblem associated with the current realization of the system state. Then, it constructs a convex surrogate function for the objective of the long-term master problem. Finally, the long-term variables are updated by solving a convex approximation problem obtained by replacing the objective function in the long-term master problem with the convex surrogate function. We establish the almost sure convergence of the TOSCA algorithm and customize the algorithmic framework to solve three important application problems. Simulations show that the TOSCA algorithm can achieve superior performance over existing solutions.
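The three steps sketched above (observe a state realization, update a convex surrogate of the long-term objective, then minimize the surrogate) can be illustrated on a toy scalar problem. The code below is a minimal sketch, not the paper's algorithm: it assumes the simple long-term objective E[(x − ξ)²] with ξ drawn online (so the optimum is x* = E[ξ]), a recursively averaged quadratic surrogate, and standard diminishing step-size choices; the short-term subproblem degenerates to simply observing ξ.

```python
import random

def tosca_sketch(steps=2000, tau=1.0, seed=0):
    """Illustrative one-variable online SCA loop (toy problem, hypothetical
    parameter choices; stands in for the general TOSCA iteration).

    Toy long-term objective: minimize_x E_xi[(x - xi)^2], xi ~ states drawn
    online, so the optimum is x* = E[xi]."""
    rng = random.Random(seed)
    x = 0.0          # long-term variable
    g_bar = 0.0      # recursively averaged surrogate gradient
    for t in range(1, steps + 1):
        # Step 1: observe the current state realization (here this replaces
        # solving a short-term subproblem, which is trivial in the toy model).
        xi = 1.0 + 0.1 * rng.gauss(0.0, 1.0)
        # Step 2: update the convex quadratic surrogate of the long-term
        # objective via recursive averaging of sample gradients.
        g = 2.0 * (x - xi)                 # sample gradient of (x - xi)^2
        rho = 1.0 / t ** 0.6               # diminishing averaging weight
        g_bar = (1.0 - rho) * g_bar + rho * g
        # Step 3: minimize the surrogate g_bar*(y - x) + tau*(y - x)^2
        # (closed form) and take a diminishing step toward the minimizer.
        x_hat = x - g_bar / (2.0 * tau)
        gamma = 1.0 / t ** 0.8             # diminishing step size
        x = (1.0 - gamma) * x + gamma * x_hat
    return x
```

Because the averaging weight and step size both diminish, the gradient noise is averaged out and the iterate settles near the long-term optimum x* = 1 in this toy setting.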
