Abstract

In this paper we consider a class of structured nonsmooth difference-of-convex (DC) minimization problems in which the first convex component is the sum of a smooth function and a nonsmooth function, while the second convex component is the supremum of possibly infinitely many convex smooth functions. We first propose an inexact enhanced DC algorithm for solving this problem when the second convex component is the supremum of finitely many convex smooth functions, and show that every accumulation point of the generated sequence is an $(\alpha,\eta)$-D-stationary point of the problem, which is generally stronger than an ordinary D-stationary point. In addition, inspired by the recent work of Pang et al. (Math. Oper. Res. 42(1):95–118, 2017) and Wen et al. (Comput. Optim. Appl. 69(2):297–324, 2018), we propose two proximal DC algorithms with extrapolation for solving this problem. We show that every accumulation point of the solution sequence generated by them is an $(\alpha,\eta)$-D-stationary point of the problem, and establish the convergence of the entire sequence under suitable assumptions. We also introduce a concept of approximate $(\alpha,\eta)$-D-stationary point and derive the iteration complexity of the proposed algorithms for finding an approximate $(\alpha,\eta)$-D-stationary point. In contrast with the DC algorithm of Pang et al. (2017), our proximal DC algorithms have much simpler subproblems and also incorporate extrapolation for possible acceleration. Moreover, one of our proximal DC algorithms is potentially applicable to the DC problem in which the second convex component is the supremum of infinitely many convex smooth functions. In addition, our algorithms have stronger convergence results than the proximal DC algorithm in Wen et al. (2018).
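The problem class above has the generic form $\min_x\ g(x) + h(x) - \max_{1\le i\le p}\psi_i(x)$, where $g$ is convex smooth, $h$ is convex nonsmooth, and each $\psi_i$ is convex smooth. As a rough illustration of one such scheme, the following is a minimal sketch of a proximal DC iteration with extrapolation in the spirit of Wen et al. (2018), applied to a toy instance; the data $A$, $b$, $C$, the $\ell_1$ regularizer, and the extrapolation schedule are all illustrative assumptions rather than the algorithms or analysis of this paper.

```python
import numpy as np

# Minimal sketch of a proximal DC iteration with extrapolation, in the
# spirit of Wen et al. (2018).  All problem data below are illustrative
# assumptions, not taken from the paper:
#   g(x) = 0.5 * ||A x - b||^2            (convex, smooth)
#   h(x) = lam * ||x||_1                  (convex, nonsmooth, prox-friendly)
#   P(x) = max_i 0.5 * ||x - c_i||^2      (supremum of finitely many
#                                          convex smooth functions)
# and the model is  minimize  g(x) + h(x) - P(x).
rng = np.random.default_rng(0)
n, m, p = 20, 30, 5
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
C = rng.standard_normal((p, n))       # rows c_i defining the pieces of P
lam = 0.1
L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of grad g

def grad_g(x):
    return A.T @ (A @ x - b)

def subgrad_P(x):
    # gradient of an active piece psi_i(x) = 0.5 * ||x - c_i||^2
    i = np.argmax(0.5 * np.sum((x - C) ** 2, axis=1))
    return x - C[i]

def prox_h(v, t):
    # soft-thresholding: proximal map of t * lam * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

x_prev = x = np.zeros(n)
for k in range(200):
    beta = min(k / (k + 3.0), 0.95)   # extrapolation weight, bounded away from 1
    y = x + beta * (x - x_prev)       # extrapolated point
    xi = subgrad_P(x)                 # linearize the concave part at the current iterate
    x_prev, x = x, prox_h(y - (grad_g(y) - xi) / L, 1.0 / L)

obj = 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x)) \
      - np.max(0.5 * np.sum((x - C) ** 2, axis=1))
print(f"objective after 200 iterations: {obj:.4f}")
```

Each iteration linearizes only the concave part at the current iterate and handles the nonsmooth convex part through its proximal map, which is what keeps the subproblems simple compared with solving the full convex subproblem exactly.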
