Abstract

Recently, compressed sensing has been widely applied to various areas such as signal processing, machine learning, and pattern recognition. To find the sparse representation of a vector with respect to a dictionary, an ℓ1 minimization problem, which is convex, is usually solved in order to overcome the computational difficulty. However, to guarantee that the ℓ1 minimizer is close to the sparsest solution, strong incoherence conditions must be imposed. In comparison, nonconvex minimization problems such as those with ℓp (0 < p < 1) penalties require much weaker incoherence conditions and a smaller signal-to-noise ratio to guarantee successful recovery. Hence ℓp (0 < p < 1) regularization serves as a better alternative to the popular ℓ1 regularization. In this paper, we review some typical algorithms for ℓp minimization, namely Iteratively Reweighted ℓ1 Minimization (IRL1), Iteratively Reweighted Least Squares (IRLS) and its general form, General Iteratively Reweighted Least Squares (GIRLS), and the Iteratively Thresholding Method (ITM), and conduct a comprehensive comparison among them, in which IRLS is identified as having the best recovery performance while also being the fastest.

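To make the IRLS idea concrete, below is a minimal, illustrative sketch of iteratively reweighted least squares for the noiseless ℓp recovery problem min ‖x‖_p^p subject to Ax = b, in the style of smoothed-weight IRLS schemes. The function name `irls_lp`, the smoothing parameter `eps` and its decay schedule, and the synthetic test data are assumptions for illustration, not details taken from the paper under review.

```python
# Illustrative IRLS sketch for lp (0 < p < 1) sparse recovery, assuming the
# noiseless model:  min ||x||_p^p  subject to  A x = b.
# All names and parameter choices here are hypothetical examples.
import numpy as np

def irls_lp(A, b, p=0.5, n_iter=50, eps=1.0):
    # Least-squares (minimum-norm) initialization.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_iter):
        # Smoothed lp weights: w_i = (x_i^2 + eps)^(p/2 - 1).
        w = (x**2 + eps) ** (p / 2.0 - 1.0)
        w_inv = 1.0 / w                      # inverse of the diagonal weight matrix W
        # Closed-form solution of the weighted LS subproblem
        #   min x^T W x  s.t.  A x = b,  i.e.  x = W^{-1} A^T (A W^{-1} A^T)^{-1} b.
        A_w = A * w_inv                      # A @ diag(w_inv), column scaling
        x = w_inv * (A.T @ np.linalg.solve(A_w @ A.T, b))
        eps = max(eps / 10.0, 1e-8)          # shrink the smoothing parameter
    return x

# Tiny usage example on a synthetic sparse signal.
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true
x_hat = irls_lp(A, b, p=0.5)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

The key design choice in this family of methods is the smoothing parameter `eps`: it keeps the weights finite when entries of x approach zero, and gradually shrinking it drives the iterates toward a sparse solution.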