Abstract
Approximate Message Passing (AMP) is an efficient iterative parameter-estimation technique for certain high-dimensional linear systems with non-Gaussian distributions, such as sparse systems. In AMP, a so-called Onsager term is added to keep the estimation errors approximately Gaussian. Orthogonal AMP (OAMP) does not require this Onsager term; it instead relies on an orthogonalization procedure to keep the current errors uncorrelated with (i.e., orthogonal to) past errors. In this paper, we show the generality and significance of this orthogonality in ensuring that errors are "asymptotically independent and identically distributed Gaussian" (AIIDG). This AIIDG property, which is essential for the attractive performance of OAMP, holds for separable functions. We present a simple and versatile procedure that establishes the orthogonality through Gram-Schmidt (GS) orthogonalization and is applicable to any prototype. We show that different AMP-type algorithms, such as expectation propagation (EP), turbo, AMP, and OAMP, can be unified under the orthogonality principle. The simplicity and generality of OAMP provide efficient solutions for estimation problems beyond the classical linear models. As an example, we study the optimization of OAMP via the GS model and GS orthogonalization. Further applications are discussed in a companion paper, where new algorithms are developed for problems with multiple constraints and multiple measurement variables.
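To make the orthogonalization idea concrete, the following is a minimal numerical sketch (not taken from the paper) of how a Gram-Schmidt projection removes the component of a current error vector that lies in the span of past errors, leaving a residual orthogonal to, and hence uncorrelated with, each past error. All function and variable names here are illustrative, not the paper's notation.

```python
import numpy as np

def gs_orthogonalize(v, past_errors):
    """Return the component of v orthogonal to the span of past_errors.

    Step 1: build an orthonormal basis of the past-error span via
    classical Gram-Schmidt. Step 2: subtract v's projection onto that
    span. The residual is then orthogonal to every past error vector.
    """
    basis = []
    for u in past_errors:
        w = np.asarray(u, dtype=float).copy()
        for b in basis:
            w -= (b @ w) * b          # remove components along earlier basis vectors
        n = np.linalg.norm(w)
        if n > 1e-12:                 # skip (near-)linearly dependent vectors
            basis.append(w / n)
    r = np.asarray(v, dtype=float).copy()
    for b in basis:
        r -= (b @ r) * b              # project out the past-error subspace
    return r

# Illustrative check with random "error" vectors.
rng = np.random.default_rng(0)
past = [rng.standard_normal(8) for _ in range(3)]
v = rng.standard_normal(8)
r = gs_orthogonalize(v, past)
```

In an AMP-type iteration, this kind of projection would be applied to the estimator's output so that the new error carries no component along previous errors; the paper's point is that this orthogonality is what keeps the error statistics AIIDG without an Onsager correction term.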