Abstract
Variational methods have become an important class of methods in signal and image restoration, a typical inverse problem. One important minimization model consists of a squared $\ell_2$ data-fidelity term (corresponding to Gaussian noise) and a regularization term built from a potential function composed with first-order difference operators. It is well known that total variation (TV) regularization, despite its great success, suffers from a contrast-reduction effect. Using a typical signal, we show that in fact all convex regularizers and most nonconvex regularizers share this effect. Motivated by this observation, we present a general truncated regularization framework: the potential function is a truncation of an existing nonsmooth potential function and is thus flat on $(\tau,+\infty)$ for some positive $\tau$. Analysis in 1D theoretically demonstrates the good contrast-preserving ability of the framework. We also give optimization algorithms with convergence verification in 2D, where global minimizers of each subproblem (either convex or nonconvex) are computed. Experiments numerically show the advantages of the framework.
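The model structure described in the abstract can be illustrated by a minimal 1D sketch, assuming a truncated $\ell_1$ potential $\varphi(t)=\min(|t|,\tau)$ as one representative instance of the framework; the function names, the parameter values, and the specific choice of potential here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def truncated_l1_potential(t, tau):
    """A truncated nonsmooth potential: grows like |t| on [0, tau],
    then stays flat on (tau, +inf), as in the proposed framework."""
    return np.minimum(np.abs(t), tau)

def objective(u, f, lam, tau):
    """Squared l2 data fidelity plus truncated regularization applied
    to the first-order differences of the 1D signal u."""
    fidelity = 0.5 * np.sum((u - f) ** 2)
    reg = np.sum(truncated_l1_potential(np.diff(u), tau))
    return fidelity + lam * reg

# A piecewise-constant signal with one large jump: under the truncated
# potential the jump costs only tau regardless of its height, so there
# is no incentive to shrink it (contrast preservation); plain TV would
# charge the full jump height (here 5.0) and favor contrast reduction.
f = np.array([0.0, 0.0, 5.0, 5.0])
print(objective(f, f, lam=1.0, tau=1.0))  # only the jump contributes: 1.0
```

The flat tail of the potential is exactly what removes the penalty on large edges; a convex potential cannot be flat at infinity without being constant, which is the intuition behind the contrast-reduction result for convex regularizers.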