Abstract

In this paper, we propose, analyze, and test an alternative method for solving the ℓ1-norm regularization problem for recovering sparse signals and blurred images in compressive sensing. The method is motivated by the recently proposed nonlinear conjugate gradient method of Tang, Li, and Cui [Journal of Inequalities and Applications, 2020(1), 27], which is designed on the basis of the least-squares technique. The proposed method minimizes a non-smooth objective consisting of a least-squares data-fitting term and an ℓ1-norm regularization term. The search directions generated by the proposed method are descent directions. In addition, under the monotonicity and Lipschitz continuity assumptions, we establish the global convergence of the method. Preliminary numerical results are reported to show the efficiency of the proposed method in practical computation.
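For orientation, the regularization problem described above is commonly written in the following standard form (the symbols $A$, $b$, and $\tau$ are our notation, since the abstract does not display the model explicitly):

$$\min_{x \in \mathbb{R}^n} \; \frac{1}{2}\|Ax - b\|_2^2 + \tau \|x\|_1,$$

where $A$ is the sensing (or blurring) matrix, $b$ is the observed data, and $\tau > 0$ is the regularization parameter balancing data fidelity against sparsity.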

Highlights

  • Discrete ill-posed problems are systems of linear equations arising from the discretization of ill-posed problems

  • Inspired by the work of Xiao and Zhu [12], the least-squares-based three-term conjugate gradient method (LSTT) for solving unconstrained optimization problems by Tang, Li, and Cui [18], and the projection technique of Solodov and Svaiter [20], we further study, analyze, and construct a derivative-free least-squares-based three-term conjugate gradient method to solve the ℓ1-norm problem arising from the reconstruction of sparse signals and images in compressive sensing

  • We can see that the signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), and structural similarity index (SSIM) of the test images computed by the DF-LSTT algorithm are slightly higher than those obtained with the conjugate gradient method (CGD), SGCS, and MFRM (a brief sketch of these metrics follows this list)
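For reference, the following minimal Python sketch shows how these image-quality measures are commonly computed; the exact definitions used in the paper are not reproduced here, so treat this as an illustration (SSIM is delegated to scikit-image):

```python
import numpy as np
from skimage.metrics import structural_similarity

def snr(x_true, x_rec):
    """Signal-to-noise ratio in dB: 10*log10(||x||^2 / ||x - x_rec||^2)."""
    return 10.0 * np.log10(np.sum(x_true**2) / np.sum((x_true - x_rec)**2))

def psnr(img_true, img_rec, peak=1.0):
    """Peak signal-to-noise ratio in dB for images with intensities in [0, peak]."""
    mse = np.mean((img_true - img_rec)**2)
    return 10.0 * np.log10(peak**2 / mse)

def ssim(img_true, img_rec):
    """Structural similarity index computed via scikit-image."""
    return structural_similarity(
        img_true, img_rec, data_range=img_true.max() - img_true.min()
    )
```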


Summary

Introduction

Discrete ill-posed problems are systems of linear equations arising from the discretization of ill-posed problems. Using the popular CG_DESCENT method [19], Xiao and Zhu [12] recently constructed a conjugate gradient method (CGD) based on the projection scheme of Solodov and Svaiter [20] to solve monotone nonlinear operator equations with convex constraints. Inspired by the work of Xiao and Zhu [12], the least-squares-based three-term conjugate gradient method (LSTT) for solving unconstrained optimization problems by Tang, Li, and Cui [18], and the projection technique of Solodov and Svaiter [20], we further study, analyze, and construct a derivative-free least-squares-based three-term conjugate gradient method to solve the ℓ1-norm problem arising from the reconstruction of sparse signals and images in compressive sensing. The projection map onto a non-empty, closed, and convex subset S ⊆ ℝⁿ is denoted by P_S and defined by P_S(t) := arg min{‖t − y‖ : y ∈ S}; it has the well-known nonexpansive property ‖P_S(h) − P_S(g)‖ ≤ ‖h − g‖ for all h, g ∈ ℝⁿ.
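As an illustration of this projection and its nonexpansiveness, the following minimal Python sketch takes S to be the nonnegative orthant; this choice of S is our assumption, motivated by the standard ℓ1 reformulation x = u − v with u, v ≥ 0, and the paper's own constraint set may differ:

```python
import numpy as np

def project_nonneg(t):
    """Projection P_S onto S = {y : y >= 0}: componentwise max(t, 0)."""
    return np.maximum(t, 0.0)

rng = np.random.default_rng(0)
h = rng.standard_normal(1000)
g = rng.standard_normal(1000)

# Nonexpansive property: ||P_S(h) - P_S(g)|| <= ||h - g||
lhs = np.linalg.norm(project_nonneg(h) - project_nonneg(g))
rhs = np.linalg.norm(h - g)
print(lhs <= rhs)  # True
```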

Reformulation of the Model
Algorithm
Global Convergence
Numerical Experiment
Experiments on the ℓ1-Norm Regularization Problem in Compressive Sensing
Experiments on Some Large-Scale Monotone Nonlinear Equations
Conclusions