Abstract

Suppose that $y = \lvert A x_0 \rvert + \eta$, where $x_0 \in \mathbb{R}^{d}$ is the target signal and $\eta \in \mathbb{R}^{m}$ is a noise vector. The goal of phase retrieval is to estimate $x_0$ from $y$. A popular model for estimating $x_0$ is the nonlinear least squares $\widehat{x} := \operatorname{argmin}_{x} \| \lvert A x \rvert - y \|_{2}$. Many efficient algorithms have been developed for solving this model, such as the seminal error reduction algorithm. In this paper, we establish the estimation performance of the model by proving that $\| \widehat{x} - x_0 \|_2 \lesssim \|\eta\|_2 / \sqrt{m}$ under the assumption that $A$ is a Gaussian random matrix. We also prove that the reconstruction error $\|\eta\|_2 / \sqrt{m}$ is sharp. For the case where $x_0$ is sparse, we study the estimation performance of both the nonlinear Lasso for phase retrieval and its unconstrained version. Our results are non-asymptotic, and we do not assume any distribution on the noise $\eta$. To the best of our knowledge, our results provide the first theoretical guarantees for the nonlinear least squares and for the nonlinear Lasso of phase retrieval.
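
For concreteness, the following is a minimal numerical sketch of fitting the nonlinear least squares model $\widehat{x} = \operatorname{argmin}_{x} \| \lvert A x \rvert - y \|_{2}$ by subgradient descent. This is an illustration only, not the paper's algorithm; the dimensions, noise level, step size, iteration budget, and random initialization are all assumptions made for the example.

```python
import numpy as np

# Illustrative sketch (not the paper's method): minimize || |Ax| - y ||_2
# via subgradient descent on the squared objective, with assumed sizes
# m = 400, d = 50, a conservative step size, and random initialization.
rng = np.random.default_rng(0)
m, d = 400, 50
A = rng.standard_normal((m, d))          # Gaussian measurement matrix
x0 = rng.standard_normal(d)              # target signal
eta = 0.01 * rng.standard_normal(m)      # noise (no distributional assumption needed in theory)
y = np.abs(A @ x0) + eta                 # phaseless, noisy measurements

x = rng.standard_normal(d)               # random initialization (illustrative)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # step size based on the spectral norm of A
for _ in range(2000):
    r = np.abs(A @ x) - y                # amplitude residual
    g = A.T @ (r * np.sign(A @ x))       # subgradient of 0.5 * ||r||_2^2
    x -= step * g

# Since |Ax| = |A(-x)|, the signal is recoverable only up to a global sign.
err = min(np.linalg.norm(x - x0), np.linalg.norm(x + x0))
print(f"relative error: {err / np.linalg.norm(x0):.3e}")
```

The printed relative error should scale on the order of $\|\eta\|_2 / \sqrt{m}$, in line with the bound stated above.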
