Abstract

Vector Approximate Message Passing (VAMP) provides a means of solving linear inverse problems in a Bayes-optimal way, assuming the measurement operator is sufficiently random. However, VAMP requires implementing the linear minimum mean squared error (LMMSE) estimator at every iteration, which makes the algorithm intractable for large-scale problems. In this work, we present a class of warm-started (WS) methods that provides a scalable approximation of the LMMSE estimator within VAMP. We show that a Message Passing (MP) algorithm equipped with a method from this class can converge to the fixed point of VAMP while having a per-iteration computational complexity proportional to that of Approximate Message Passing (AMP). Additionally, we provide the Onsager correction and a multi-dimensional State Evolution for MP algorithms utilizing one of the WS methods. Lastly, we show that the approximation approach used in the recently proposed Memory AMP (MAMP) algorithm is a special case of the developed class of WS methods.
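The abstract does not spell out the WS class itself, but the core idea it describes can be sketched: the LMMSE stage of VAMP solves a regularized normal equation, and an iterative solver warm-started from the previous VAMP iterate replaces the exact O(N^3) solve with a few matrix-vector products. The following Python sketch is a hypothetical illustration under that assumption, using a conjugate-gradient solver; all names (ws_lmmse_cg, n_cg, gamma, sigma2) and the CG choice are illustrative assumptions, not the paper's actual WS method or notation.

import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def ws_lmmse_cg(A, y, r, gamma, sigma2, x_warm=None, n_cg=5):
    """Approximate the VAMP LMMSE stage, which solves
        (A^T A / sigma2 + gamma * I) x = A^T y / sigma2 + gamma * r,
    with a few conjugate-gradient steps warm-started at x_warm.
    Illustrative sketch only; names do not follow the paper."""
    M, N = A.shape

    # Matrix-free application of the SPD system matrix.
    def matvec(x):
        return A.T @ (A @ x) / sigma2 + gamma * x

    H = LinearOperator((N, N), matvec=matvec)
    b = A.T @ y / sigma2 + gamma * r
    # Warm start: reuse the previous iteration's solution as x0,
    # so a small fixed number of CG steps suffices per iteration.
    x, _ = cg(H, b, x0=x_warm, maxiter=n_cg)
    return x

# Usage sketch: carry x across (stand-in) VAMP iterations as the warm start.
rng = np.random.default_rng(0)
M, N = 200, 400
A = rng.standard_normal((M, N)) / np.sqrt(M)
y = A @ rng.standard_normal(N) + 0.01 * rng.standard_normal(M)
x = None
for it in range(10):
    r = rng.standard_normal(N)  # stand-in for the denoiser output message
    x = ws_lmmse_cg(A, y, r, gamma=1.0, sigma2=1e-4, x_warm=x, n_cg=5)

Each call costs only n_cg matrix-vector products with A and A^T, which is how a WS approximation can keep the per-iteration complexity proportional to that of AMP rather than to an exact LMMSE solve.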
