Abstract

For signal reconstruction in a generalized linear model (GLM), generalized approximate message passing (GAMP) is a low-complexity algorithm with many appealing features, such as an exact performance characterization in the high-dimensional limit. However, it is viable only when the transformation matrix has independent and identically distributed (IID) entries. Generalized vector AMP (GVAMP) is more widely applicable but comes at a high computational cost. To overcome the shortcomings of GAMP and GVAMP, we propose a low-complexity and widely applicable generalized memory AMP (GMAMP) framework, comprising an orthogonal memory linear estimator (MLE) and two orthogonal memory nonlinear estimators (MNLEs), which guarantee the asymptotic IID Gaussianity of the estimation errors and the state evolution (SE) of GMAMP. The proposed GMAMP is universal in that the existing AMP, convolutional AMP, orthogonal/vector AMP, GVAMP, and memory AMP (MAMP) are all special instances of it. More importantly, we provide a principle for building new advanced AMP-type algorithms on the proposed GMAMP framework. As an example, we construct a Bayes-optimal GMAMP (BO-GMAMP) algorithm that adopts a memory matched-filter estimator to suppress the linear interference, so that its complexity is comparable to that of GAMP. Furthermore, we prove that the SE of BO-GMAMP with optimized parameters converges to the same fixed point as that of the high-complexity GVAMP. In other words, BO-GMAMP achieves the replica minimum (i.e., potentially Bayes-optimal) mean square error (MSE) whenever its SE has a unique fixed point. Finally, simulation results are provided to validate the accuracy of the theoretical analysis.
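For context, the GLM referred to above is typically formulated as follows in the GAMP literature; this is a standard-notation sketch, not the paper's own statement, and the symbols (x, A, z, y, p) are assumed here rather than taken from the abstract:

% Standard GLM setup: an unknown signal x is observed through a linear
% transform A followed by a componentwise measurement channel p(y|z).
\begin{align}
  \mathbf{z} &= \mathbf{A}\mathbf{x}, \qquad \mathbf{A} \in \mathbb{R}^{M \times N}, \\
  y_m &\sim p\!\left(y_m \mid z_m\right), \qquad m = 1, \dots, M.
\end{align}

Signal reconstruction then amounts to estimating x from y given A and the channel p(y|z); GAMP requires A to have IID entries, whereas GVAMP and the proposed GMAMP target broader matrix ensembles.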
