Abstract

We study an extension of the proximal method for convex programming in which the usual quadratic regularization kernel is replaced by a class of convex statistical distances, called φ-divergences, which are typically entropy-like in form. After establishing several basic properties of these quasi-distances, we present a convergence analysis of the resulting entropy-like proximal algorithm. Applying this algorithm to the dual of a convex program, we recover a wide class of nonquadratic multiplier methods and prove their convergence.
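
For concreteness, a minimal sketch of the iteration described above, assuming the φ-divergence kernel standard in this literature (the symbols λ_k, φ, and d_φ are notation introduced here, not taken from the abstract): starting from a point x^{k-1} with positive coordinates, the entropy-like proximal step replaces the quadratic term of the classical method,

$$
x^{k} \in \operatorname*{argmin}_{x \ge 0} \Bigl\{ f(x) + \lambda_k \, d_\varphi\bigl(x, x^{k-1}\bigr) \Bigr\},
\qquad
d_\varphi(x, y) = \sum_{i=1}^{n} y_i \, \varphi\!\left(\frac{x_i}{y_i}\right),
$$

where f is the convex objective, λ_k > 0 is a regularization parameter, and φ is a convex kernel with φ(1) = 0. For example, the choice φ(t) = t log t − t + 1 yields the Kullback–Leibler divergence d_φ(x, y) = Σ_i (x_i log(x_i/y_i) − x_i + y_i), which is the prototypical entropy-like case; the classical proximal method is recovered when d_φ is replaced by the squared Euclidean distance.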
