Abstract
Denoising is the problem of estimating a signal $\mathbf{x}_0$ from its noisy observations $\mathbf{y}=\mathbf{x}_0+\mathbf{z}$. In this paper, we focus on structured denoising, where the signal $\mathbf{x}_0$ possesses a certain structure and $\mathbf{z}$ has independent normally distributed entries with mean zero and variance $\sigma^2$. We employ a structure-inducing convex function $f(\cdot)$ and solve $\min_{\mathbf{x}}\{\frac{1}{2}\Vert\mathbf{y}-\mathbf{x}\Vert_2^2+\sigma\lambda f(\mathbf{x})\}$ to estimate $\mathbf{x}_0$, for some $\lambda>0$. Common choices for $f(\cdot)$ include the $\ell_1$ norm for sparse vectors, the $\ell_1$-$\ell_2$ norm for block-sparse signals, and the nuclear norm for low-rank matrices. The metric we use to evaluate the performance of an estimate $\mathbf{x}^*$ is the normalized mean-squared error $\text{NMSE}(\sigma)=\frac{\mathbb{E}\Vert\mathbf{x}^*-\mathbf{x}_0\Vert_2^2}{\sigma^2}$. We show that the NMSE is maximized as $\sigma\rightarrow 0$, and we find the exact worst-case NMSE, which has a simple geometric interpretation: the mean-squared distance of a standard normal vector to the $\lambda$-scaled subdifferential $\lambda\partial f(\mathbf{x}_0)$. When $\lambda$ is optimally tuned to minimize the worst-case NMSE, our results can be related to the constrained denoising problem $\min_{f(\mathbf{x})\le f(\mathbf{x}_0)}\{\Vert\mathbf{y}-\mathbf{x}\Vert_2\}$. The paper also connects these results to the generalized LASSO problem, in which one solves $\min_{f(\mathbf{x})\le f(\mathbf{x}_0)}\{\Vert\mathbf{y}-\mathbf{A}\mathbf{x}\Vert_2\}$ to estimate $\mathbf{x}_0$ from noisy linear observations $\mathbf{y}=\mathbf{A}\mathbf{x}_0+\mathbf{z}$. We show that certain properties of the LASSO problem are closely related to the denoising problem. In particular, we characterize the normalized LASSO cost and show that it exhibits a phase transition as a function of the number of observations. We also provide an order-optimal bound for the LASSO error in terms of the mean-squared distance. Our results are significant in two ways. First, we find a simple formula for the performance of a general convex estimator. Second, we establish a connection between the denoising and linear inverse problems.
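For concreteness, here is a minimal numerical sketch (not from the paper) of the sparse case $f(\cdot)=\Vert\cdot\Vert_1$, where the denoising program has the closed-form soft-thresholding solution. It compares the empirical NMSE in the small-$\sigma$ regime against a Monte Carlo estimate of the mean-squared distance $\mathbb{E}\,\mathrm{dist}^2(\mathbf{g},\lambda\partial\Vert\mathbf{x}_0\Vert_1)$ for $\mathbf{g}\sim\mathcal{N}(0,\mathbf{I})$. The dimensions, sparsity, trial count, and the threshold choice $\lambda=\sqrt{2\log(n/k)}$ are illustrative assumptions, not the paper's tuning.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 1000, 50                     # ambient dimension, sparsity (illustrative)
lam = np.sqrt(2 * np.log(n / k))    # an assumed, commonly used threshold level
sigma = 1e-3                        # small noise level, probing sigma -> 0

# sparse ground truth x0 (support on the first k coordinates)
x0 = np.zeros(n)
x0[:k] = rng.standard_normal(k)

def soft_threshold(y, t):
    """Minimizer of 0.5*||y - x||_2^2 + t*||x||_1 (closed form for f = l1)."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

# empirical NMSE = E||x* - x0||_2^2 / sigma^2, via Monte Carlo
trials = 2000
err = 0.0
for _ in range(trials):
    y = x0 + sigma * rng.standard_normal(n)
    x_star = soft_threshold(y, sigma * lam)
    err += np.sum((x_star - x0) ** 2)
nmse = err / (trials * sigma ** 2)

# mean-squared distance of g ~ N(0, I) to lam * subdifferential of ||.||_1 at x0:
# on the support the set is the single point lam*sign(x0_i);
# off the support it is the interval [-lam, lam].
g = rng.standard_normal((trials, n))
supp = x0 != 0
d2 = (np.sum((g[:, supp] - lam * np.sign(x0[supp])) ** 2, axis=1)
      + np.sum(np.maximum(np.abs(g[:, ~supp]) - lam, 0.0) ** 2, axis=1))
dist_sq = d2.mean()

print(f"empirical NMSE at sigma={sigma:g}: {nmse:.1f}")
print(f"E dist^2(g, lam * subdifferential): {dist_sq:.1f}")
```

Under these assumptions the two printed quantities should roughly agree, illustrating the claim that the worst-case NMSE, attained as $\sigma\rightarrow 0$, equals the mean-squared distance of a standard normal vector to $\lambda\partial f(\mathbf{x}_0)$.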