Abstract

The joint sparse recovery problem is a generalization of the single measurement vector problem widely studied in compressed sensing. It aims to recover a set of jointly sparse vectors, i.e., vectors whose nonzero entries are concentrated at common locations. Many algorithms designed for this problem rely on l_p-minimization applied to matrices, i.e., l_{2,p}-minimization:

$$\min_{X \in \mathbb{R}^{n\times r}} \Vert X \Vert_{2,p} \quad \text{s.t. } AX = B.$$

The main contribution of this paper is two theoretical results about this technique. The first proves that for every multiple system of linear equations there exists a constant p^* such that the original unique sparse solution can also be recovered by minimization in the l_p quasi-norm applied to matrices whenever 0 < p < p^*. The second gives an analytic expression for such a p^*. Finally, we present one example to confirm the validity of our conclusions, and we use numerical experiments to show that our results increase the efficiency of algorithms designed for l_{2,p}-minimization.

Highlights

  • In sparse information processing, one of the central problems is to recover a sparse solution of an underdetermined linear system, with applications such as visual coding [1], matrix completion [2], source localization [3], and face recognition [4]

  • That is, letting A be an underdetermined matrix of size m × n and b ∈ R^m a vector representing some signal, the single measurement vector (SMV) problem is popularly modeled as the following l0-minimization: $$\min_{x \in \mathbb{R}^n} \Vert x \Vert_0 \quad \text{s.t. } Ax = b$$

  • In this paper, we focus on the equivalence relationship between l_{2,p}-minimization and l_{2,0}-minimization


Summary

Introduction

One of the central problems in sparse information processing is to recover a sparse solution of an underdetermined linear system, with applications such as visual coding [1], matrix completion [2], source localization [3], and face recognition [4]. That is, letting A be an underdetermined matrix of size m × n and b ∈ R^m a vector representing some signal, the single measurement vector (SMV) problem is popularly modeled as the following l0-minimization:

$$\min_{x \in \mathbb{R}^n} \Vert x \Vert_0 \quad \text{s.t. } Ax = b. \qquad (1)$$

A natural extension of the single measurement vector problem is the joint sparse recovery problem, known as the multiple measurement vector (MMV) problem, which arises naturally in source localization [3], neuromagnetic imaging [5], and channel equalization. Its solutions X ∈ R^{n×r} are jointly sparse, i.e., the solution vectors share a common support and have nonzero entries concentrated at common locations.
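The quantities above can be sketched numerically. The following is a minimal illustration, assuming the common convention that the mixed l_{2,p} quasi-norm takes the l2 norm of each row and then the l_p quasi-norm of the resulting vector, and that ||X||_{2,0} counts the nonzero rows (the joint support size); the matrix X below is a made-up jointly sparse example, not data from the paper.

```python
import numpy as np

def l2p_quasinorm(X, p):
    """Mixed l_{2,p} quasi-norm (assumed convention):
    (sum over rows i of ||x_i||_2 ** p) ** (1/p), for 0 < p <= 1."""
    row_norms = np.linalg.norm(X, axis=1)  # l2 norm of each row
    return np.sum(row_norms ** p) ** (1.0 / p)

# A jointly 2-sparse solution matrix: only rows 0 and 3 are nonzero,
# so all three columns share the common support {0, 3}.
X = np.zeros((5, 3))
X[0] = [1.0, -2.0, 0.5]
X[3] = [0.3, 0.3, 0.3]

# ||X||_{2,0}: number of rows with a nonzero l2 norm (joint support size).
row_support = int(np.count_nonzero(np.linalg.norm(X, axis=1)))
print(row_support)                # 2
print(l2p_quasinorm(X, 0.5))      # quasi-norm for p = 0.5
print(l2p_quasinorm(X, 1.0))      # for p = 1 this is the sum of row norms
```

As p decreases toward 0, sum_i ||x_i||_2^p approaches the row-support count, which is why l_{2,p}-minimization with small p serves as a surrogate for l_{2,0}-minimization.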

