Abstract. The non-convex $\alpha\lVert\,{\cdot}\,\rVert_{\ell_{1}}-\beta\lVert\,{\cdot}\,\rVert_{\ell_{2}}$ ($\alpha\geq\beta\geq 0$) regularization is a recent approach for sparse recovery. A minimizer of the $\alpha\lVert\,{\cdot}\,\rVert_{\ell_{1}}-\beta\lVert\,{\cdot}\,\rVert_{\ell_{2}}$-regularized functional can be computed with the ST-($\alpha\ell_{1}-\beta\ell_{2}$) algorithm, which is similar to the classical iterative soft thresholding algorithm (ISTA). Unfortunately, ISTA is known to converge quite slowly; a faster alternative is the projected gradient (PG) method. The PG method, however, is currently applicable only to linear inverse problems. In this paper, we extend the PG method, via a surrogate-function approach, to nonlinear inverse problems with $\alpha\lVert\,{\cdot}\,\rVert_{\ell_{1}}-\beta\lVert\,{\cdot}\,\rVert_{\ell_{2}}$ ($\alpha\geq\beta\geq 0$) regularization in the finite-dimensional space $\mathbb{R}^{n}$. We show that the proposed algorithm converges subsequentially to a stationary point of a constrained Tikhonov-type functional for sparsity regularization. Numerical experiments on a nonlinear compressive sensing problem illustrate the efficiency of the proposed approach.
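As background (not part of the paper's method), the classical ISTA referenced above can be sketched for a linear problem $\min_x \tfrac12\lVert Ax-y\rVert^2 + \alpha\lVert x\rVert_{\ell_1}$; the function names here are illustrative, and the componentwise soft-thresholding map is the proximal operator of $t\lVert\cdot\rVert_{\ell_1}$:

```python
import numpy as np

def soft_threshold(x, t):
    # Componentwise soft-thresholding: prox of t * ||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, alpha, step, n_iter=200):
    # Classical ISTA for min_x 0.5*||A x - y||^2 + alpha*||x||_1:
    # a gradient step on the smooth data-fit term followed by
    # soft-thresholding with threshold step * alpha.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - step * grad, step * alpha)
    return x
```

The ST-($\alpha\ell_1-\beta\ell_2$) algorithm of the abstract replaces this soft-thresholding step by the (more involved) thresholding map associated with the non-convex $\alpha\lVert\cdot\rVert_{\ell_1}-\beta\lVert\cdot\rVert_{\ell_2}$ penalty, which is not reproduced here.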