Abstract

We investigate the efficiency of the orthogonal super greedy algorithm (OSGA) for sparse recovery and approximation under the restricted isometry property (RIP). We first show that, under conditions on the RIP constants of the measurement matrix Φ and on the minimum magnitude of the nonzero coordinates of the signal, and for an ℓ₂-bounded or ℓ∞-bounded noise vector e, OSGA with explicit stopping rules can recover the support of an arbitrary K-sparse signal x from y = Φx + e in at most K steps. Then, we investigate the error performance of OSGA in m-term approximation with respect to dictionaries satisfying the RIP in a separable Hilbert space. We establish a Lebesgue-type inequality for OSGA. Based on this inequality, we obtain the optimal rate of convergence for the sparse class induced by such dictionaries.
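
For readability, the measurement model studied in the abstract can be written out explicitly (ε below is an illustrative label for the noise bound, which this excerpt does not specify):

y = Φx + e,  where x ∈ ℝ^N is K-sparse (at most K nonzero coordinates) and either ‖e‖₂ ≤ ε or ‖e‖∞ ≤ ε.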

Highlights

  • Recovery and approximation by sparse linear combinations of elements from a fixed redundant family are frequently used in many application areas, such as image and signal processing, PDE solvers and statistical learning; see [1]

  • We consider the orthogonal super greedy algorithm (OSGA), which is more efficient than the orthogonal greedy algorithm (OGA) from the viewpoint of computational complexity; a brief sketch of the algorithm is given after this list

  • We show that OSGA can recover a K-sparse signal from noisy measurements in at most K steps
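
As a concrete illustration of how OSGA(s) differs from OGA, here is a minimal NumPy sketch, assuming the standard greedy scheme (pick the s columns most correlated with the residual, then project orthogonally onto all columns selected so far). The names osga, s, max_steps and tol are illustrative, and the simple residual-norm stopping rule stands in for the explicit noise-dependent stopping rules analysed in the paper.

```python
import numpy as np

def osga(Phi, y, s, max_steps, tol=1e-10):
    """Minimal sketch of OSGA(s), the orthogonal super greedy algorithm."""
    m, N = Phi.shape
    support = []                  # indices selected so far
    residual = y.astype(float).copy()
    coef = np.zeros(0)
    for _ in range(max_steps):
        # correlate every column of Phi with the current residual
        corr = np.abs(Phi.T @ residual)
        corr[support] = -np.inf   # never reselect an index
        # take the s indices with the largest correlation (OGA corresponds to s = 1)
        new_idx = np.argpartition(corr, -s)[-s:]
        support.extend(int(i) for i in new_idx)
        # orthogonal projection of y onto the span of the selected columns
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
        if np.linalg.norm(residual) <= tol:   # simple stopping rule
            break
    x_hat = np.zeros(N)
    x_hat[support] = coef
    return x_hat, sorted(support)
```

Setting s = 1 recovers the usual OGA; with s > 1 each iteration enlarges the support by s indices, which is why OSGA(s) can recover the signal in fewer iterations, as noted in the highlights above.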



Introduction

Recovery and approximation by sparse linear combinations of elements from a fixed redundant family are frequently used in many application areas, such as image and signal processing, PDE solvers and statistical learning; see [1]. In Section 2, we will study the efficiency of the OSGA in recovering an N-dimensional sparse signal from linear measurements. This topic is known in the literature as compressed sensing (CS) [13,14,15]. OSGA(s) generalizes the orthogonal greedy algorithm (OGA) in the sense that it selects s indices in each iteration; it can therefore recover the sparse signal in fewer steps and further reduce the complexity. To investigate the efficiency of the OSGA (also called orthogonal multi-matching pursuit, OMMP) in CS, we use the restricted isometry property (RIP) of Φ, which ensures the stable recovery of x from noisy measurements. This property was introduced by Candès and Tao [16,17] as follows.
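
In its standard form (the paper's exact statement may differ slightly in notation), the definition reads: a matrix Φ satisfies the RIP of order K with constant δ_K ∈ (0, 1) if

(1 − δ_K)‖x‖₂² ≤ ‖Φx‖₂² ≤ (1 + δ_K)‖x‖₂²  for every K-sparse vector x ∈ ℝ^N,

and the smallest such δ_K is called the restricted isometry constant of order K.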
