Abstract

We investigate the recovery of almost s-sparse vectors x ∈ C^N from undersampled and inaccurate data y = Ax + e ∈ C^m by means of minimizing ‖z‖_1 subject to the equality constraints Az = y. If m ≍ s ln(N/s) and if Gaussian random matrices A ∈ R^{m×N} are used, this equality-constrained ℓ_1-minimization is known to be stable with respect to sparsity defects and robust with respect to measurement errors. We prove here that, with m ≍ s ln(N/s) measurements, the equality-constrained ℓ_1-minimization remains stable and robust when Weibull random matrices are used instead. The arguments are based on two key ingredients, namely the robust null space property and the quotient property. The robust null space property relies on a variant of the classical restricted isometry property in which the inner norm is replaced by the ℓ_1-norm and the outer norm is replaced by a norm comparable to the ℓ_2-norm. For ℓ_1-minimization subject to inequality constraints, this yields stability and robustness results that remain valid when sparsity is measured relative to a redundant dictionary. As for the quotient property, it relies on lower estimates for the tail probability of sums of independent Weibull random variables.
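
As an illustrative aside, not taken from the paper, the recovery procedure described above, namely minimizing ‖z‖_1 subject to Az = y, can be sketched numerically with a symmetric Weibull measurement matrix. The sketch below assumes the numpy and cvxpy libraries; the dimensions, sparsity level, Weibull shape parameter, noise level, column normalization, and the restriction to real-valued vectors are all illustrative assumptions rather than the paper's choices.

```python
# Minimal sketch (not the paper's implementation): equality-constrained
# l1-minimization (basis pursuit) with a symmetric Weibull random matrix.
# All parameter choices below are illustrative assumptions.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

N, m, s = 400, 120, 10           # ambient dimension, measurements, sparsity (illustrative)

# Symmetric Weibull entries: a random sign times a Weibull(shape) magnitude.
shape = 1.0                      # assumed shape parameter; the paper's normalization may differ
A = rng.choice([-1.0, 1.0], size=(m, N)) * rng.weibull(shape, size=(m, N))
A /= np.linalg.norm(A, axis=0)   # normalize columns to unit Euclidean length (a common convention)

# Almost s-sparse ground truth: s large entries plus a small dense tail (sparsity defect).
x = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
x[support] = rng.standard_normal(s)
x += 1e-3 * rng.standard_normal(N)

# Inaccurate measurements y = Ax + e.
e = 1e-3 * rng.standard_normal(m)
y = A @ x + e

# Equality-constrained l1-minimization: minimize ||z||_1 subject to Az = y.
z = cp.Variable(N)
problem = cp.Problem(cp.Minimize(cp.norm1(z)), [A @ z == y])
problem.solve()

print("recovery error:", np.linalg.norm(z.value - x))
```

Because the measurements are noisy and the vector is only almost sparse, the recovered vector is not exact; the point of the stability and robustness results is that the recovery error stays controlled by the sparsity defect and the measurement error even though no inequality constraint involving e is used.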
