Abstract

The problem of minimizing the sum of nonsmooth, convex objective functions defined on a real Hilbert space over the intersection of the fixed point sets of nonexpansive mappings, onto which projections cannot be efficiently computed, is considered. To solve this problem, the use of proximal point algorithms that exploit the proximity operators of the objective functions, together with incremental optimization techniques, is proposed. With a focus on fixed point approximation techniques, two algorithms are devised. One blends an incremental subgradient method, a useful algorithm for nonsmooth convex optimization, with a Halpern-type fixed point iteration; the other is based on an incremental subgradient method and the Krasnosel’skiĭ–Mann fixed point algorithm. It is shown that every weak sequential cluster point of the sequence generated by the Halpern-type algorithm belongs to the solution set of the problem, and that the sequence generated by the Krasnosel’skiĭ–Mann-type algorithm has a weak sequential cluster point that also belongs to the solution set. Numerical comparisons of the two proposed algorithms with existing subgradient methods on concrete nonsmooth convex optimization problems show that the proposed algorithms converge faster.
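
To make the structure of such hybrid iterations concrete, the following Python sketch pairs an incremental subgradient sweep over the component functions with a Halpern-type step and with a Krasnosel’skiĭ–Mann-type step. Everything problem-specific here is an assumption introduced for illustration: the objective sum_i |a_i·x − b_i|, the choice of the nonexpansive mapping T as a projection onto a Euclidean ball, and the step-size and averaging parameters are placeholders, not the paper’s algorithms or parameter rules.

```python
import numpy as np

# Illustrative problem (assumed): minimize sum_i |a_i . x - b_i| over Fix(T),
# where T is a nonexpansive mapping chosen here as a projection onto a ball.
rng = np.random.default_rng(0)
m, d = 20, 5
A = rng.standard_normal((m, d))
b = rng.standard_normal(m)

def T(x, radius=1.0):
    """Nonexpansive mapping: metric projection onto a Euclidean ball (assumption)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def subgrad(i, x):
    """A subgradient of f_i(x) = |a_i . x - b_i|."""
    return np.sign(A[i] @ x - b[i]) * A[i]

def halpern_incremental(x0, anchor, iters=200):
    """Halpern-type hybrid (assumed form): an incremental subgradient sweep,
    then anchoring toward a fixed reference point through T."""
    x = x0.copy()
    for n in range(1, iters + 1):
        lam = 1.0 / (n + 1)      # diminishing subgradient step size (placeholder)
        alpha = 1.0 / (n + 1)    # Halpern anchoring weight (placeholder)
        y = x.copy()
        for i in range(m):       # incremental pass over the component functions
            y = y - lam * subgrad(i, y)
        x = alpha * anchor + (1.0 - alpha) * T(y)
    return x

def km_incremental(x0, iters=200):
    """Krasnosel'skii-Mann-type hybrid (assumed form): averaging the current
    iterate with T applied after an incremental subgradient sweep."""
    x = x0.copy()
    for n in range(1, iters + 1):
        lam = 1.0 / (n + 1)
        alpha = 0.5              # fixed KM averaging parameter (placeholder)
        y = x.copy()
        for i in range(m):
            y = y - lam * subgrad(i, y)
        x = (1.0 - alpha) * x + alpha * T(y)
    return x

x0 = np.zeros(d)
print("Halpern-type iterate:", halpern_incremental(x0, anchor=x0))
print("KM-type iterate:     ", km_incremental(x0))
```

In their classical forms, the Halpern update x_{n+1} = α_n x_0 + (1 − α_n) T x_n anchors each iterate to a fixed reference point, whereas the Krasnosel’skiĭ–Mann update x_{n+1} = (1 − α_n) x_n + α_n T x_n only averages with the current iterate; the two functions above mirror that distinction, but the way the subgradient sweep is combined with the fixed point step is an assumed form rather than the authors’ exact scheme.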
