Abstract

In many compressive sensing reconstruction algorithms, a good choice of key parameters, such as the number of measurements (which depends on the unknown signal sparsity) or the regularization parameter, is critical for successful signal recovery. Cross-validation provides a principled method of doing so, by dividing the measurements into a ‘reconstruction set’ used to recover the signal for each candidate parameter value, and a ‘cross-validation set’ used to determine, in a purely data-driven manner, which candidate parameter value is optimal. In previous work, this technique has been theoretically analyzed for the case of noiseless compressive measurements or for the case of additive i.i.d. Gaussian noise in the measurements. This paper presents the first theoretical analysis of this technique for compressed sensing when the measurements are corrupted by Poisson noise, the dominant noise distribution in optical systems. Using two different methods, we present different types of theoretical bounds in the form of confidence intervals on the actual (unobservable) recovery error in terms of the (observable) cross-validation error. We show in particular that these bounds become tighter with an increase in the underlying signal intensity or in the number of cross-validation measurements (when other factors are held constant). We validate our theoretical bounds with simulations and experimental results for Poisson compressed sensing with the widely used Lasso estimator.
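To make the procedure concrete, the following is a minimal Python sketch of cross-validated parameter selection for Poisson compressed sensing with the Lasso. It is not the paper's code: the problem sizes, the nonnegative random sensing matrices, and the candidate regularization grid are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not values from the paper)
n, m, m_cv, k = 200, 80, 20, 10   # signal length, recon. measurements, CV measurements, sparsity

# Sparse nonnegative signal; Poisson measurements require nonnegative means,
# so the sensing matrices are taken entrywise nonnegative here
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.uniform(50, 100, k)
Phi_r = rng.uniform(0, 1, (m, n)) / m          # 'reconstruction set' sensing matrix
Phi_cv = rng.uniform(0, 1, (m_cv, n)) / m_cv   # 'cross-validation set' sensing matrix

# Poisson-corrupted compressive measurements
y_r = rng.poisson(Phi_r @ x)
y_cv = rng.poisson(Phi_cv @ x)

# Sweep candidate regularization parameters; keep the one with the smallest
# cross-validation error, used as a data-driven surrogate for the
# unobservable recovery error ||x_hat - x||
best = None
for lam in np.logspace(-3, 1, 20):
    lasso = Lasso(alpha=lam, fit_intercept=False, positive=True, max_iter=10000)
    lasso.fit(Phi_r, y_r)
    x_hat = lasso.coef_
    cv_err = np.linalg.norm(y_cv - Phi_cv @ x_hat) / np.sqrt(m_cv)
    if best is None or cv_err < best[1]:
        best = (lam, cv_err, x_hat)

lam_star, cv_star, x_star = best
print(f"selected lambda = {lam_star:.4g}, CV error = {cv_star:.3f}")
print(f"recovery error  = {np.linalg.norm(x_star - x):.3f}")  # unobservable in practice
```

The paper's bounds relate the observable quantity `cv_err` to the true recovery error; the sketch only illustrates the selection mechanism the abstract describes.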
