Abstract

Deep neural networks and model-based methods are both widely used, owing to their success in many inference problems. In this paper, we leverage deep learning to develop efficient algorithms for two popular nonconvex regularization methods: the smoothly clipped absolute deviation (SCAD) and the minimax concave penalty (MCP). The approximate message passing (AMP) algorithm is an effective tool for optimizing nonconvex regularization models. First, we unroll the AMP-based algorithm into a feed-forward neural network, dubbed Unrolled-AMP, by introducing novel neuron activation functions. Then, for the case where the measurement matrix deviates from an i.i.d. Gaussian distribution, we propose two improved iterative algorithms based on the vector AMP (VAMP) algorithm to solve the nonconvex regularization problems, and we further unroll them into a feed-forward neural network, dubbed Unrolled-VAMP. Both network architectures learn all of their parameters from training data via back-propagation. Finally, we analyze the convergence of the Unrolled-AMP algorithm and demonstrate the efficiency of the proposed networks through experiments on sparse signal reconstruction and 5G wireless communication.
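As a rough illustration of the AMP iterations that Unrolled-AMP unfolds into network layers, the sketch below runs AMP with an MCP thresholding denoiser on a synthetic noiseless sparse-recovery problem with an i.i.d. Gaussian measurement matrix. The threshold rule (`alpha`, `gamma`) and the demo dimensions are illustrative assumptions, not values from the paper; in the unrolled network these fixed per-iteration parameters would instead be learned from training data by back-propagation.

```python
import numpy as np

def eta_mcp(u, lam, gamma=3.0):
    # MCP thresholding: firm shrinkage for |u| <= gamma*lam, identity beyond.
    return np.where(
        np.abs(u) <= gamma * lam,
        np.sign(u) * np.maximum(np.abs(u) - lam, 0.0) * gamma / (gamma - 1.0),
        u,
    )

def eta_mcp_deriv(u, lam, gamma=3.0):
    # Derivative of the MCP threshold, needed for the Onsager correction term.
    d = np.zeros_like(u)
    d[(np.abs(u) > lam) & (np.abs(u) <= gamma * lam)] = gamma / (gamma - 1.0)
    d[np.abs(u) > gamma * lam] = 1.0
    return d

def amp_mcp(y, A, n_iter=60, alpha=1.4, gamma=3.0):
    # AMP: x <- eta(x + A^T z), with the residual z carrying an Onsager term.
    m, n = A.shape
    x, z = np.zeros(n), y.copy()
    for _ in range(n_iter):
        lam = alpha * np.linalg.norm(z) / np.sqrt(m)  # data-driven threshold
        u = x + A.T @ z                                # pseudo-data
        x = eta_mcp(u, lam, gamma)
        z = y - A @ x + z * eta_mcp_deriv(u, lam, gamma).sum() / m
    return x

# Demo: recover a k-sparse signal from m < n Gaussian measurements.
rng = np.random.default_rng(0)
m, n, k = 200, 400, 15
A = rng.standard_normal((m, n)) / np.sqrt(m)   # i.i.d. Gaussian measurements
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k)
y = A @ x0
rel_err = np.linalg.norm(amp_mcp(y, A) - x0) / np.linalg.norm(x0)
```

Each loop iteration corresponds to one layer of the unrolled network: the matched filter `A.T @ z` and residual update become fixed linear layers, and the MCP threshold becomes the layer's nonlinear activation.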
