Abstract

Recently, compressed sensing has been widely applied to areas such as signal processing, machine learning, and pattern recognition. To find the sparse representation of a vector with respect to a dictionary, a convex ℓ1 minimization problem is usually solved in order to overcome the computational difficulty. However, to guarantee that the ℓ1 minimizer is close to the sparsest solution, strong incoherence conditions must be imposed. In comparison, nonconvex minimization problems, such as those with ℓp (0 < p < 1) penalties, require much weaker incoherence conditions and a smaller signal-to-noise ratio to guarantee successful recovery. Hence ℓp (0 < p < 1) regularization serves as a better alternative to the popular ℓ1 regularization. In this paper, we review some typical algorithms for ℓp minimization, namely Iteratively Reweighted ℓ1 Minimization (IRL1), Iteratively Reweighted Least Squares (IRLS) together with its general form, General Iteratively Reweighted Least Squares (GIRLS), and the Iterative Thresholding Method (ITM), and conduct a comprehensive comparison among them, in which IRLS is identified as having the best performance and being the fastest as well.
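To make the setting concrete, the following is a minimal sketch of one of the reviewed algorithm families, IRLS for the noiseless ℓp problem min ‖x‖ₚᵖ subject to Ax = b. It is not the paper's exact implementation; the smoothing parameter ε, its geometric decay schedule, and the iteration count are illustrative assumptions (a common choice, following the standard ε-regularized IRLS scheme).

```python
import numpy as np

def irls_lp(A, b, p=0.5, eps=1.0, n_iter=100):
    """Sketch of IRLS for  min ||x||_p^p  s.t.  Ax = b,  0 < p < 1.

    Each iteration solves a weighted least-norm problem whose weights
    come from the previous iterate; eps smooths the weights so that
    zero entries of x do not cause division by zero.
    """
    # Least-squares initialization (any feasible point would do).
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_iter):
        # Smoothed l_p weights: w_i = (x_i^2 + eps)^(p/2 - 1).
        w = (x**2 + eps) ** (p / 2.0 - 1.0)
        Winv = np.diag(1.0 / w)
        # Minimizer of sum_i w_i x_i^2 subject to Ax = b, in closed form:
        # x = W^{-1} A^T (A W^{-1} A^T)^{-1} b.
        x = Winv @ A.T @ np.linalg.solve(A @ Winv @ A.T, b)
        # Illustrative schedule: shrink the smoothing toward zero.
        eps = max(0.5 * eps, 1e-8)
    return x
```

On a typical compressed-sensing instance (e.g. a Gaussian measurement matrix with many more columns than rows and a sufficiently sparse ground truth), this iteration recovers the sparse vector from far fewer measurements than the ambient dimension.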
