Abstract

When the sparse regularizer is convex and its proximal operator admits a closed form, first-order iterative algorithms built on proximal operators can solve sparse optimization problems effectively. Recently, plug-and-play (PnP) algorithms have achieved significant success by replacing the proximal operators in such iterative algorithms with advanced denoisers. However, convex sparse regularizers such as the ℓ1-norm tend to underestimate the large entries of the sparse solution. In contrast, convex non-convex (CNC) sparse regularization employs a non-convex regularizer while preserving the convexity of the overall objective function. In this paper, we propose several PnP algorithms for solving the CNC sparse regularization model and analyze their convergence properties. Specifically, we first derive the proximal operator of the CNC sparse regularizer in iterative form and then integrate it with several first-order algorithms to obtain different PnP algorithms. Based on monotone operator theory, we prove that the proposed PnP algorithms converge provided that the data-fidelity term is strongly convex and the residuals of the denoisers are contractive. We also emphasize the significance of strong convexity of the data-fidelity term for the CNC sparse regularization model. Finally, we demonstrate the superiority of the proposed PnP algorithms on extensive image restoration tasks.
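To illustrate the PnP idea the abstract describes, here is a minimal sketch of a forward-backward (ISTA-style) iteration in which the proximal step is replaced by a generic denoiser. This is not the paper's CNC-specific algorithm; all names and the toy soft-thresholding "denoiser" are illustrative stand-ins for a learned denoiser.

```python
import numpy as np

def pnp_forward_backward(A, b, denoiser, step, x0, iters=500):
    """PnP forward-backward splitting: the gradient step on the
    data-fidelity term ||Ax - b||^2 / 2 is followed by a denoiser
    playing the role of the proximal operator of the regularizer."""
    x = x0
    for _ in range(iters):
        grad = A.T @ (A @ x - b)        # gradient of the data-fidelity term
        x = denoiser(x - step * grad)   # denoiser replaces the proximal step
    return x

# Toy example: with soft-thresholding as the "denoiser", this reduces to
# classical ISTA for the l1-norm; a learned denoiser would be plugged in here.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[[3, 7]] = [2.0, -1.5]
b = A @ x_true
soft = lambda v: np.sign(v) * np.maximum(np.abs(v) - 0.01, 0.0)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # step below 1/L, L = ||A^T A||
x_hat = pnp_forward_backward(A, b, soft, step, np.zeros(20))
```

Note the bias the abstract alludes to: even in this noiseless toy problem, the ℓ1 proximal step shrinks the large entries of `x_hat` slightly below their true values, which is the motivation for non-convex (and CNC) regularizers.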
