Abstract

In this paper, we study the problem of estimating a structured signal $x_0 \in \mathbb{R}^n$ from non-linear and noisy Gaussian observations. Supposing that $x_0$ is contained in a certain convex subset $K \subset \mathbb{R}^n$, we prove that accurate recovery is already feasible when the number of observations exceeds the effective dimension of $K$, a common measure of the complexity of signal classes. It turns out that the possibly unknown non-linearity of our model affects the error rate only by a multiplicative constant. Our approach builds on recent works by Plan and Vershynin, who proposed treating the non-linearity as noise that perturbs a linear measurement process. Using the concept of restricted strong convexity, we show that their results for the generalized Lasso extend to a fairly large class of convex loss functions. Moreover, we allow for the presence of adversarial noise, so that even deterministic model inaccuracies can be handled. These generalizations provide further evidence of why many standard estimators perform surprisingly well in practice, even though they do not rely on any knowledge of the underlying output rule. In this way, our results provide a unified and general framework for signal reconstruction in high dimensions, covering various challenges from the fields of compressed sensing, signal processing, and statistical learning.
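For concreteness, one way to write down the setup sketched above is via a semi-parametric observation model together with a constrained empirical-risk estimator in the spirit of the generalized Lasso. The notation below ($m$ observations $y_i$, Gaussian measurement vectors $a_i$, an unknown output rule $f$, and a convex loss $\mathcal{L}$) is illustrative, since these symbols are not fixed in the abstract itself:

$$y_i = f\big(\langle a_i, x_0 \rangle\big), \quad i = 1, \dots, m, \qquad \hat{x} \in \operatorname*{argmin}_{x \in K} \; \frac{1}{m} \sum_{i=1}^{m} \mathcal{L}\big(y_i, \langle a_i, x \rangle\big).$$

Under this reading, the claim is that $\hat{x}$ recovers $x_0$ (up to a multiplicative constant induced by $f$) once $m$ exceeds the effective dimension of $K$, even though $f$ itself is not used by the estimator.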
