Abstract

The Bregman Proximal Gradient (BPG) algorithm minimizes the sum of two convex functions, one of which may be nonsmooth. In existing analyses, supercoercivity of the objective function is required for the convergence of this algorithm, which precludes its use in many applications. In this paper, we give an inexact version of the BPG algorithm and circumvent the supercoercivity condition by replacing it with a simple condition on the parameters of the problem. Our study covers the existing results while giving new ones.

Highlights

  • We consider the following minimization problem: inf{Ψ(x) ≔ f(x) + g(x) : x ∈ ℝ^d}. (1) Here f is a convex, proper, lower-semicontinuous (l.s.c.) function and g is a convex, continuously differentiable function. This problem arises in many applications including compressed sensing [1], signal recovery [2], and the phase retrieval problem [3]

  • where λn is the stepsize at each iteration. The Proximal Gradient Method and its variants [4,5,6,7,8,9,10,11,12,13,14] have long been a hot topic in the optimization field due to their simple forms

  • The authors could replace the intricate question of Lipschitz continuity of the gradient with a convexity condition that is easy to verify, which we call below the Lipschitz-like/Convexity Condition (LC) property. Thereby, they proposed and studied the algorithm called NoLips, defined by xn
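To make the Bregman-style update concrete, here is a minimal sketch, not the paper's exact NoLips/BPG scheme: we take f = 0 and the Shannon entropy as the Bregman kernel, for which the update step has a closed form. The kernel choice, the stepsize, and the toy objective are our own illustrative assumptions.

```python
import numpy as np

def bregman_gradient_step(x, grad_g, step):
    """One Bregman (mirror-descent style) gradient step on the positive
    orthant, using the Shannon entropy kernel h(x) = sum(x*log x), whose
    Bregman distance D_h(u, x) is the Kullback-Leibler divergence.

    With f = 0, minimizing <grad_g, u> + (1/step) * D_h(u, x) over u > 0
    has the closed-form solution u = x * exp(-step * grad_g).
    """
    return x * np.exp(-step * grad_g)

# Toy illustration: minimize g(x) = 0.5 * ||x - c||^2 over x > 0,
# where grad g(x) = x - c; the iterates stay strictly positive.
c = np.array([2.0, 1.0])
x = np.ones(2)  # strictly positive starting point
for _ in range(500):
    x = bregman_gradient_step(x, x - c, step=0.1)
```

Because the kernel keeps every iterate in the interior of the positive orthant, no projection step is needed; this is precisely the appeal of Bregman schemes on constrained domains.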


Summary

Introduction

We consider the following minimization problem: inf{Ψ(x) ≔ f(x) + g(x) : x ∈ ℝ^d}, where f is a convex, proper, lower-semicontinuous (l.s.c.) function and g is a convex, continuously differentiable function. This problem arises in many applications including compressed sensing [1], signal recovery [2], and the phase retrieval problem [3]. One classical algorithm for solving this problem is the proximal gradient (PG) method: xn ≔ argmin{ f(u) + ⟨∇g(x_{n−1}), u⟩ + (1/(2λn))‖u − x_{n−1}‖² }, n ∈ ℕ*. A central property required in the analysis of gradient methods is the Lipschitz continuity of the gradient of the smooth part g. In many applications, the differentiable function does not have this property, e.g., in the broad class of Poisson inverse problems. The authors could replace the intricate question of Lipschitz continuity of the gradient with a convexity condition that is easy to verify, which we call below the LC property. While circumventing the supercoercivity condition required in [15, 22] by replacing it with a simple condition on the parameters of the problem, our study covers the existing results while giving others. Notation: ∂f denotes the subdifferential of f (3); argmin f ≔ {x ∈ ℝ^d : f(x) = inf f} denotes its set of minimizers; (4) ε-argmin f ≔ {x ∈ ℝ^d : f(x) ≤ inf f + ε} denotes its set of ε-minimizers.
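As a concrete instance of the PG iteration above, the following sketch applies it to the standard lasso-type pair f(x) = reg·‖x‖₁, g(x) = ½‖Ax − b‖² (our own illustrative choice of f and g), for which the subproblem in u has the closed-form solution given by soft-thresholding:

```python
import numpy as np

def prox_l1(v, t):
    """Proximal map of t * ||.||_1, i.e., soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, reg, step, n_iter=500):
    """PG iteration for f(x) = reg * ||x||_1 and g(x) = 0.5 * ||Ax - b||^2.

    Each step minimizes f(u) + <grad g(x_{n-1}), u> + (1/(2*step)) * ||u - x_{n-1}||^2,
    whose minimizer is the prox of step*f evaluated at the gradient step
    x_{n-1} - step * grad g(x_{n-1}).
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)  # gradient of the smooth part g
        x = prox_l1(x - step * grad, step * reg)
    return x
```

For convergence of this classical scheme, the stepsize must satisfy step ≤ 1/L, where L is the Lipschitz constant of ∇g (here ‖AᵀA‖); removing that requirement is exactly the motivation for the Bregman approach studied in this paper.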

Preliminary
Main Results
Application to Nonnegative Linear Inverse Problem
