Abstract
We investigate the recovery of almost s-sparse vectors x ∈ ℂ^N from undersampled and inaccurate data y = Ax + e ∈ ℂ^m by minimizing ‖z‖₁ subject to the equality constraint Az = y. If m ≍ s ln(N/s) and if Gaussian random matrices A ∈ ℝ^{m×N} are used, this equality-constrained ℓ₁-minimization is known to be stable with respect to sparsity defects and robust with respect to measurement errors. We prove here that, if m ≍ s ln(N/s) and if Weibull random matrices are used, the equality-constrained ℓ₁-minimization remains stable and robust. The argument rests on two key ingredients, namely the robust null space property and the quotient property. The robust null space property relies on a variant of the classical restricted isometry property in which the inner norm is replaced by the ℓ₁-norm and the outer norm is replaced by a norm comparable to the ℓ₂-norm. For ℓ₁-minimization subject to inequality constraints, this yields stability and robustness results that are also valid when sparsity is measured relative to a redundant dictionary. As for the quotient property, it relies on lower estimates for the tail probabilities of sums of independent Weibull random variables.
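The following Python sketch illustrates the recovery scheme described above under simplifying assumptions that are not taken from the paper: real-valued data, a symmetrized Weibull matrix with shape parameter 1, ad-hoc normalization and constants, and basis pursuit recast as a linear program. It is meant only as a minimal numerical illustration, not as the paper's construction.

```python
# Minimal sketch (illustrative assumptions, not the paper's exact setup):
# equality-constrained l1-minimization  min ||z||_1  s.t.  Az = y,
# with a symmetrized Weibull random measurement matrix.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

N, s = 200, 5                      # ambient dimension and sparsity level
m = int(2 * s * np.log(N / s))     # m ~ s ln(N/s); the constant 2 is ad hoc

# Symmetrized Weibull entries (shape parameter 1); normalization is illustrative
A = rng.choice([-1.0, 1.0], size=(m, N)) * rng.weibull(1.0, size=(m, N))
A /= np.sqrt(m)

# Almost s-sparse ground truth with a small sparsity defect and measurement error e
x = np.zeros(N)
x[rng.choice(N, s, replace=False)] = rng.standard_normal(s)
x += 1e-4 * rng.standard_normal(N)           # sparsity defect
y = A @ x + 1e-4 * rng.standard_normal(m)    # inaccurate data y = Ax + e

# Basis pursuit as a linear program over (z, t):
#   minimize sum(t)  subject to  -t <= z <= t  and  Az = y
c = np.concatenate([np.zeros(N), np.ones(N)])
A_ub = np.block([[np.eye(N), -np.eye(N)], [-np.eye(N), -np.eye(N)]])
b_ub = np.zeros(2 * N)
A_eq = np.hstack([A, np.zeros((m, N))])
bounds = [(None, None)] * N + [(0, None)] * N
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=bounds, method="highs")

z_hat = res.x[:N]
print("recovery error:", np.linalg.norm(z_hat - x))
```

The complex-valued setting of the paper would require a second-order cone formulation instead of this linear program; the real-valued restriction here is purely for brevity.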