Abstract

This paper proposes Bayes-optimal convolutional approximate message-passing (CAMP) for signal recovery in compressed sensing. CAMP uses the same low-complexity matched filter (MF) for interference suppression as approximate message-passing (AMP). To improve the convergence property of AMP for ill-conditioned sensing matrices, the so-called Onsager correction term in AMP is replaced by a convolution of all preceding messages. The tap coefficients in the convolution are determined so as to realize asymptotic Gaussianity of estimation errors via state evolution (SE) under the assumption of orthogonally invariant sensing matrices. An SE equation is derived to optimize the sequence of denoisers in CAMP. The optimized CAMP is proved to be Bayes-optimal for all orthogonally invariant sensing matrices if the SE equation converges to a fixed point and the fixed point is unique. For sensing matrices with low-to-moderate condition numbers, CAMP can achieve the same performance as high-complexity orthogonal/vector AMP, which requires the linear minimum mean-square error (LMMSE) filter instead of the MF.
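
To make the structure concrete, here is a minimal, hypothetical sketch of a CAMP-style iteration in Python. It is not the paper's exact algorithm: the soft-thresholding denoiser, the level lam, and the tap weights are illustrative placeholders, whereas the paper optimizes a sequence of Bayes-optimal denoisers and derives the tap coefficients from state evolution.

    # Hypothetical sketch of a CAMP-style iteration (not the paper's
    # exact algorithm). The defining idea: AMP's single-term Onsager
    # correction is replaced by a weighted sum ("convolution") of ALL
    # preceding residual messages. The tap weights are placeholders
    # here; the paper derives them via state evolution so that the
    # estimation errors remain asymptotically Gaussian.
    import numpy as np

    def soft_threshold(u, lam):
        # Illustrative denoiser; the paper instead optimizes a
        # sequence of Bayes-optimal denoisers via the SE equation.
        return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

    def camp_sketch(y, A, n_iter=30, lam=0.1):
        M, N = A.shape
        x = np.zeros(N)
        z = y.copy()
        messages = []                 # all preceding messages z_0, ..., z_t
        for t in range(n_iter):
            messages.append(z)
            u = x + A.T @ z           # low-complexity matched filter, as in AMP
            x = soft_threshold(u, lam)
            # Placeholder taps: only the most recent message is weighted,
            # with the usual AMP Onsager coefficient, so this sketch
            # degenerates to plain AMP. CAMP instead optimizes the whole
            # tap sequence via state evolution.
            taps = np.zeros(len(messages))
            taps[-1] = np.count_nonzero(x) / M
            z = y - A @ x + sum(g * m for g, m in zip(taps, messages))
        return x

With SE-optimized taps in place of the placeholder, the per-iteration cost is still dominated by the two matrix-vector products with A and A^T, which is why CAMP has complexity comparable to AMP rather than to LMMSE-based OAMP/VAMP.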

Highlights

  • Convolutional approximate message-passing (CAMP) can achieve the same performance as orthogonal AMP (OAMP)/vector AMP (VAMP) for sensing matrices with low-to-moderate condition numbers, while it is inferior to OAMP/VAMP for high condition numbers.

  • The Bayes-optimal CAMP overcomes the disadvantages of approximate message-passing (AMP) and of OAMP/VAMP while retaining their advantages for orthogonally invariant sensing matrices with low-to-moderate condition numbers: the Bayes-optimal CAMP is an efficient message-passing (MP) algorithm with complexity comparable to AMP.

  • Φ_M ∈ O^{N×r} consists of all left-singular vectors corresponding to the r non-zero singular values, while Φ_M^⊥ ∈ O^{N×(N−r)} is composed of the left-singular vectors corresponding to the N − r zero singular values (see the sketch after this list).
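
The split in the last highlight can be illustrated numerically. A minimal sketch, assuming only that Φ_M collects the left-singular vectors of the non-zero singular values and Φ_M^⊥ the remaining ones (the variable names are ours):

    # Split the left-singular vectors of a rank-r matrix A into
    # Phi_M (r columns, non-zero singular values) and Phi_M_perp
    # (N - r columns, zero singular values).
    import numpy as np

    rng = np.random.default_rng(0)
    N, M, r = 8, 5, 3
    A = rng.standard_normal((N, r)) @ rng.standard_normal((r, M))  # rank r

    U, s, Vt = np.linalg.svd(A)              # U is N x N and orthogonal
    rank = int(np.sum(s > 1e-10 * s.max()))  # numerical rank (= r here)
    Phi_M = U[:, :rank]                      # N x r
    Phi_M_perp = U[:, rank:]                 # N x (N - r)

    # The two blocks are orthogonal and together span R^N.
    assert np.allclose(Phi_M.T @ Phi_M_perp, 0.0)
    assert np.allclose(Phi_M @ Phi_M.T + Phi_M_perp @ Phi_M_perp.T, np.eye(N))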


Summary

Compressed Sensing

Compressed sensing (CS) [1], [2] is a powerful technique for recovering sparse signals from compressed measurements. In the noiseless case w = 0, Wu and Verdú [3] proved that, if and only if the compression rate δ = M/N is equal to or larger than the (Rényi) information dimension of the signal, there exist a sensing matrix A and a signal-recovery method such that the signal vector x can be recovered with negligibly small error probability. An important issue in CS is the construction of practical sensing matrices and of low-complexity signal-recovery algorithms that achieve the information-theoretic compression limit. The compression limit of zero-mean i.i.d. sensing matrices was analyzed with the non-rigorous replica method [7], [8], a tool developed in statistical mechanics [9], [10]. Haar orthogonal sensing matrices achieve a compression rate equal to the Rényi information dimension. An ultimate algorithm for signal recovery is required to be low-complexity and Bayes-optimal for all orthogonally invariant sensing matrices.
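
As a concrete instance (a standard example from the CS literature, not stated in this excerpt): for an i.i.d. Bernoulli-Gaussian signal whose entries are zero with probability 1 − ρ and standard Gaussian with probability ρ, the Rényi information dimension works out to

    d(X) = (1 − ρ) · 0 + ρ · 1 = ρ,

so the Wu-Verdú result says that recovery with negligibly small error probability is possible if and only if the compression rate satisfies δ = M/N ≥ ρ; a sparser signal tolerates more aggressive compression.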

Message-Passing
Methodology
Contributions
Organization
Notation
Definitions and Assumptions
General Error Model
State Evolution
Convolutional Approximate Message-Passing
Error Model
Asymptotic Gaussianity
SE Equation
Implementation
Simulation Conditions
Ill-Conditioned Sensing Matrices
Conclusions
Formulation
Proof by Induction
SE Equations
Generating Functions
Evaluation at Poles
Time-Domain Representation
Summary
Bayes-Optimal Denoiser
Sufficient Statistic
Correlation
Joint pdf
