Abstract

In the context of big data analysis, stochastic optimization algorithms are widely used as effective tools to handle data complexity and data uncertainty. These algorithms usually aim to solve problems modeled as stochastic programs, some of which have nonconvex objective functions. On the other hand, DCA (difference of convex functions algorithm) has proven its strength in tackling a large class of smooth or nonsmooth nonconvex optimization problems known as DC programs. The key advantages of DCA are its simplicity and flexibility, which allow it to handle large-scale problems arising in various contexts. This chapter concerns methods that incorporate ideas of stochastic optimization, in an online manner, into the DCA framework to create new algorithms called online stochastic DCA. The first section introduces the chapter. The second section presents deterministic DC programming and DCA. The third section briefly reviews stochastic optimization. The fourth section is dedicated to stochastic DC programming and DCA, where we propose two online stochastic DCA schemes for solving a class of stochastic DC programs. The last section concludes the chapter with a discussion of promising directions for the topic.
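Since the abstract does not spell out the chapter's algorithms, the following is only a minimal, generic sketch of the basic (deterministic) DCA iteration for f = g - h with g, h convex: linearize h at the current iterate and minimize the resulting convex surrogate. All names (dca, subgrad_h, solve_convex_subproblem) and the toy instance are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

def dca(subgrad_h, solve_convex_subproblem, x0, max_iter=200, tol=1e-10):
    """Generic DCA sketch for f(x) = g(x) - h(x) with g, h convex.

    At iteration k: pick y_k in the subdifferential of h at x_k, replace h by
    its affine minorant, and minimize the convex surrogate g(x) - <y_k, x>.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = subgrad_h(x)                    # linearize the concave part -h
        x_new = solve_convex_subproblem(y)  # convex subproblem in x
        if np.linalg.norm(x_new - x) <= tol * (1.0 + np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Toy nonconvex instance (hypothetical): f(x) = x**4/4 - x**2/2 with the DC
# decomposition g(x) = x**4/4, h(x) = x**2/2; the convex subproblem
# x**3 = y has the closed-form solution cbrt(y).
x_star = dca(subgrad_h=lambda x: x,
             solve_convex_subproblem=lambda y: np.cbrt(y),
             x0=np.array([0.3]))
print(x_star)  # approaches the local minimizer x = 1
```

In this sketch each DCA step only requires a subgradient of h and the solution of one convex subproblem; the online stochastic schemes proposed in the chapter modify how that information is obtained, but their precise form is given in the chapter itself, not here.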
