Abstract

In this paper we present two different classes of general multiple-splitting algorithms for solving finite-dimensional convex optimization problems. Under the assumption that the function being minimized can be written as the sum of $K$ convex functions, each of which has a Lipschitz continuous gradient, we prove that the number of iterations needed by the first class of algorithms to obtain an $\epsilon$-optimal solution is $O((K-1)L/\epsilon)$, where $L$ is an upper bound on all of the Lipschitz constants. The algorithms in the second class are accelerated versions of those in the first class; their complexity improves to $O(\sqrt{(K-1)L/\epsilon})$ while the computational effort required at each iteration remains almost unchanged. To the best of our knowledge, the complexity results presented in this paper are the first of this type to be given for splitting and alternating direction-type methods. Moreover, all algorithms proposed in this paper are parallelizable, which makes the...
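To make the setting concrete, the problem class and accuracy target described above can be written out as follows. This is a standard formalization inferred from the abstract's assumptions; the symbols $f_i$, $L_i$, $\bar{x}$, and $x^*$ are our notation, not necessarily the paper's.

\[
\min_{x \in \mathbb{R}^n} \; f(x) = \sum_{i=1}^{K} f_i(x),
\qquad
\|\nabla f_i(x) - \nabla f_i(y)\| \le L_i \|x - y\| \ \text{for all } x, y,
\qquad
L \ge \max_{1 \le i \le K} L_i,
\]

where each $f_i$ is convex. A point $\bar{x}$ is called $\epsilon$-optimal if $f(\bar{x}) - f(x^*) \le \epsilon$ for a minimizer $x^*$ of $f$; the iteration bounds $O((K-1)L/\epsilon)$ and $O(\sqrt{(K-1)L/\epsilon})$ quoted above refer to reaching such a point.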
