Abstract

Compressed sensing (CS) states that a sparse signal can be exactly recovered from very few linear measurements. In many applications, however, real-world signals exhibit additional structure beyond standard sparsity. A typical example is the class of block-sparse signals, whose non-zero coefficients occur in a few blocks. In this article, we investigate the mixed l2/lq (0 < q ≤ 1) norm minimization method for the exact and robust recovery of such block-sparse signals. We show that the non-convex l2/lq (0 < q < 1) minimization method has a stronger sparsity-promoting ability than the commonly used l2/l1 minimization method, both practically and theoretically. In terms of a block variant of the restricted isometry property of the measurement matrix, we present sufficient conditions for exact and robust block-sparse signal recovery that are weaker than those known for l2/l1 minimization. We also propose an efficient Iteratively Reweighted Least-Squares (IRLS) algorithm for the induced non-convex optimization problem. The obtained weaker conditions and the proposed IRLS algorithm are tested and compared with the mixed l2/l1 minimization method and the standard lq minimization method on a series of noiseless and noisy block-sparse signals. All the comparisons demonstrate that the mixed l2/lq (0 < q < 1) method outperforms the alternatives for block-sparse signal recovery applications and is meaningful for the development of new CS techniques.
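As a quick illustration of the objective just described, the following is a minimal Python sketch that evaluates the mixed l2/lq norm of a block-sparse vector for q = 1 and q = 0.5. It assumes consecutive blocks of equal length and the definition ||x||_{2,q} = (sum_i ||x[i]||_2^q)^{1/q}; these conventions, and all names in the snippet, are illustrative choices and may differ in minor details from the paper's notation.

import numpy as np

def mixed_l2_lq(x, block_len, q):
    """Mixed l2/lq norm of x, with consecutive blocks of length block_len."""
    block_norms = np.linalg.norm(x.reshape(-1, block_len), axis=1)   # per-block l2 norms
    return np.sum(block_norms ** q) ** (1.0 / q)

x = np.array([3.0, 4.0, 0.0, 0.0, 0.3, 0.4])   # three blocks of length 2, two of them active
print(mixed_l2_lq(x, block_len=2, q=1.0))      # 5.5  (the mixed l2/l1 norm)
print(mixed_l2_lq(x, block_len=2, q=0.5))      # about 8.66; for q < 1 small non-zero blocks are penalized relatively more heavily

For recovery, one minimizes this quantity (equivalently, sum_i ||x[i]||_2^q) subject to the measurement constraint, which is what the IRLS algorithm discussed in the article is designed to do.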

Highlights

  • According to the Shannon/Nyquist sampling theorem [1,2], if we would like to avoid loss of information when capturing a signal, we must sample the signal at the so-called Nyquist rate, i.e., at least twice the highest frequency of the signal

  • Since the theorem only exploits the bandlimitedness of a signal, and most real-world signals are sparse or compressible, massive data acquisition based on the Shannon/Nyquist sampling theorem typically collects a large amount of redundant information, which must then be compressed so that only the few essential components of the signal are stored or encoded

  • Huang and Zhang [13] developed a theory for mixed l2/l1 minimization by using a concept called strong group sparsity, and demonstrated that mixed-norm minimization is very efficient for recovering strongly group-sparse signals


Summary

Introduction

According to the Shannon/Nyquist sampling theorem [1,2], if we would like to avoid loss of information when capturing a signal, we must sample the signal at the so-called Nyquist rate, i.e., at least twice the highest frequency of the signal. Eldar and Mishali [29] generalized the sufficient recovery conditions to block-sparse signals in both the noiseless and noisy settings. They showed that if the measurement matrix is drawn at random, as in conventional CS, it satisfies the block-RIP with overwhelming probability. All these results illustrate that one can recover a block-sparse signal exactly and stably via the convex mixed l2/l1 minimization method whenever the measurement matrix is constructed from a random ensemble (e.g., the Gaussian ensemble). We provide sufficient conditions for the exact and stable recovery of block-sparse signals through mixed l2/lq (0 < q < 1) norm minimization, and further develop an IRLS algorithm similar to those in [28,33] for solving the resulting non-convex optimization problem
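To make the last point concrete, below is a minimal sketch of one possible IRLS-type iteration for the noiseless problem min_x sum_i ||x[i]||_2^q subject to A x = y. It assumes blocks of equal length d, an epsilon-smoothing schedule, and per-block weights w_i = (||x[i]||_2^2 + eps^2)^(q/2 - 1); it is an illustrative implementation under those assumptions, not necessarily identical to the algorithms of [28,33] or the one developed in this article.

import numpy as np

def block_irls(A, y, d, q=0.5, n_iter=100, eps=1.0, tol=1e-8):
    """IRLS sketch for min sum_i ||x[i]||_2^q subject to A x = y, with blocks of length d."""
    m, n = A.shape
    assert n % d == 0, "signal length must be a multiple of the block length"
    x = np.linalg.lstsq(A, y, rcond=None)[0]          # minimum-norm initial guess
    for _ in range(n_iter):
        block_norms = np.linalg.norm(x.reshape(-1, d), axis=1)
        # smoothed per-block weights w_i = (||x[i]||_2^2 + eps^2)^(q/2 - 1),
        # replicated over the d coordinates of each block
        w = (block_norms ** 2 + eps ** 2) ** (q / 2 - 1)
        Winv = np.repeat(1.0 / w, d)
        # weighted least-norm solution of A x = y:  x = W^{-1} A^T (A W^{-1} A^T)^{-1} y
        AWinv = A * Winv                               # same as A @ diag(Winv)
        x_new = Winv * (A.T @ np.linalg.solve(AWinv @ A.T, y))
        if np.linalg.norm(x_new - x) < tol * max(np.linalg.norm(x), 1.0):
            x = x_new
            break
        x = x_new
        eps = max(eps / 10.0, 1e-6)                    # gradually tighten the smoothing
    return x

# toy usage: recover a block-sparse signal from Gaussian measurements
rng = np.random.default_rng(0)
n, d, k, m = 120, 4, 3, 40                             # length, block size, active blocks, measurements
x_true = np.zeros(n)
for b in rng.choice(n // d, size=k, replace=False):
    x_true[b * d:(b + 1) * d] = rng.standard_normal(d)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true
x_hat = block_irls(A, y, d, q=0.5)
print("relative recovery error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

In the noisy setting, the equality constraint A x = y would be relaxed to a bound on ||A x - y||_2, and the weighted subproblem changes accordingly; the hyperparameters above (the initial eps, its decay, and the stopping tolerance) are illustrative defaults only.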

Sufficient block-sparse recovery conditions
Noiseless recovery
Noisy recovery
Conclusion