Abstract

Orthogonal least squares (OLS) is a classic algorithm for sparse recovery and subset selection. In this paper, we analyze the performance guarantee of the OLS algorithm using the restricted isometry property (RIP) framework. Specifically, we show that OLS exactly recovers any K-sparse signal in K iterations, provided that a sampling matrix satisfies the RIP with
\begin{equation*}
\delta_{K+1} < \frac{1}{\sqrt{(1+\delta_{K+1})K+\frac{1}{4}}+\frac{1}{2}}.
\end{equation*}
Our result bridges the gap between the recent result of Wen et al. and the fundamental limit of OLS at which exact reconstruction cannot be uniformly guaranteed. Furthermore, we show that the OLS algorithm is stable under measurement noise. Specifically, we show that if the signal-to-noise ratio (SNR) scales linearly with the sparsity of an input signal, then the $\ell_{2}$-norm of the recovery error is bounded by the noise power.
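For readers unfamiliar with the algorithm analyzed here, the following is a minimal sketch of plain OLS (not the paper's analysis): at each of K iterations it adds the column whose inclusion minimizes the least-squares residual over the enlarged support. The function name `ols` and the small random test setup are illustrative assumptions, not from the paper.

```python
import numpy as np

def ols(A, y, K):
    """Greedy orthogonal least squares: select K columns of A to explain y.

    At each iteration, choose the column that yields the smallest residual
    after a least-squares fit over the enlarged support (this residual-based
    selection rule is what distinguishes OLS from OMP's correlation rule).
    """
    m, n = A.shape
    support = []
    for _ in range(K):
        best_j, best_res = None, np.inf
        for j in range(n):
            if j in support:
                continue
            S = support + [j]
            # Least-squares fit restricted to the candidate support S
            x_S, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
            res = np.linalg.norm(y - A[:, S] @ x_S)
            if res < best_res:
                best_j, best_res = j, res
        support.append(best_j)
    # Final least-squares estimate on the selected support
    x_S, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x = np.zeros(n)
    x[support] = x_S
    return x, sorted(support)
```

With a well-conditioned (e.g. random Gaussian) sampling matrix and noiseless measurements of a K-sparse signal, this sketch recovers the support exactly in K iterations, matching the recovery setting the abstract describes.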
