Abstract

Relying on fixed point techniques, Mahey, Oualibouch and Tao introduced in a 1995 paper the scaled proximal decomposition on the graph of a maximal monotone operator (SPDG) algorithm and analysed its performance on inclusions for strongly monotone and Lipschitz continuous operators. The SPDG algorithm generalizes Spingarn's partial inverse method by allowing scaling factors, a key strategy for speeding up the convergence of numerical algorithms. In this note, we show that the SPDG algorithm can alternatively be analysed by means of Spingarn's original partial inverse framework, which dates back to his 1983 paper. We simply show that, under the assumptions considered by Mahey, Oualibouch and Tao, the Spingarn partial inverse of the underlying maximal monotone operator is strongly monotone, which allows one to employ recent results on the convergence and iteration-complexity of proximal point-type methods for strongly monotone operators. In doing so, we additionally obtain potentially faster convergence for the SPDG algorithm and a more accurate upper bound on the number of iterations needed to achieve prescribed tolerances, especially on ill-conditioned problems.
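For the reader's convenience, here is a minimal sketch of the construction invoked above, assuming the standard definition of the partial inverse from Spingarn's 1983 paper; the symbols T, V, P_V, mu and L below are our notation and are not taken from the abstract.

% Assumed setting (not spelled out in the abstract): T : H -> 2^H maximal monotone,
% V a closed subspace of the Hilbert space H, and P_V, P_{V^\perp} the orthogonal
% projections onto V and its orthogonal complement.
\[
  v \in T_V(z)
  \quad\Longleftrightarrow\quad
  P_V v + P_{V^\perp} z \;\in\; T\bigl(P_V z + P_{V^\perp} v\bigr),
\]
% i.e., the graph of T_V is obtained from the graph of T by swapping the V^\perp
% components of points and values. The observation described in the abstract is that
% if T is mu-strongly monotone and L-Lipschitz continuous, then T_V inherits strong
% monotonicity, so proximal point-type results for strongly monotone operators can be
% applied to T_V, and hence to the SPDG iterates.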
