Abstract

We address the problem of compressive phase retrieval (CPR) with a generative prior. The problem is ill-posed and requires structural assumptions on the signal. Classical CPR techniques impose a sparsity prior to reconstruct the signal from compressive phaseless measurements. Recent data-driven signal models in the form of generative priors have been shown to outperform sparsity priors while requiring significantly fewer measurements. However, the performance of generative-prior-based methods can be improved further by introducing structure in the latent space. We propose to impose structure on the signal by enforcing sparsity in the latent space via a proximal step while training the generator; we call this optimization proximal meta-learning (PML). Enforcing sparsity in the latent space naturally leads to a union-of-submanifolds model in the solution space. The overall framework, which couples latent-space sparsity with PML, is called sparsity-driven latent space sampling (SDLSS). We demonstrate the efficacy of the proposed framework over the state-of-the-art deep phase retrieval (DPR) technique on the MNIST and CelebA datasets. We evaluate the performance as a function of the number of measurements and the sparsity factor using standard objective measures. The results show that SDLSS performs better at higher compression ratios and recovers the signal faster than DPR.
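The abstract describes enforcing sparsity on the latent vector via a proximal method. A minimal sketch of one common such proximal step, hard thresholding (the proximal operator of an ℓ0 sparsity constraint ‖z‖₀ ≤ k), is shown below; the function name, the choice of hard over soft thresholding, and the sparsity level `k` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def hard_threshold(z, k):
    """Keep the k largest-magnitude entries of z and zero the rest.

    This is the proximal operator of the constraint ||z||_0 <= k,
    i.e., the Euclidean projection onto the set of k-sparse vectors.
    """
    out = np.zeros_like(z)
    idx = np.argsort(np.abs(z))[-k:]  # indices of the k largest |z_i|
    out[idx] = z[idx]
    return out

# Example: project a latent vector onto the set of 2-sparse vectors.
z = np.array([0.1, -2.0, 0.05, 1.5, -0.3])
z_sparse = hard_threshold(z, k=2)  # -> [0.0, -2.0, 0.0, 1.5, 0.0]
```

In a training loop of the kind the abstract suggests, such a step would be applied to the latent code after each gradient update, so that the generator is trained on (and sampled from) sparse latent vectors, yielding the union-of-submanifolds structure mentioned above.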
