Abstract

Compressed sensing (CS) seeks algorithms that recover a sparse vector from noisy linear observations. Various Bayesian algorithms, such as sparse Bayesian learning (SBL) and approximate message passing (AMP) based algorithms, have been proposed for this task. SBL is accurate and robust, but its computational complexity is high due to matrix inversion. AMP, in turn, has performance guarantees only under severe restrictions on the measurement matrix, which limits its applicability to CS problems. To overcome these drawbacks, in this paper we present a low-complexity algorithm for the single linear model that incorporates vector AMP (VAMP) into the SBL structure with expectation maximization (EM). Specifically, we apply variance auto-tuning to VAMP to implement the E-step of SBL, which decreases the number of iterations required to converge compared with the VAMP-EM algorithm when a Gaussian mixture (GM) prior is used. Simulation results show that the proposed algorithm achieves better performance with high robustness across various difficult measurement matrices.
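For context, the sketch below shows the classical EM-SBL iteration that the abstract builds on: the E-step computes the Gaussian posterior of the sparse vector through an N x N matrix inversion, which is the complexity bottleneck the proposed method avoids by replacing this E-step with a VAMP pass using variance auto-tuning (not reproduced here). The problem sizes, variable names, and Gaussian measurement matrix are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed problem sizes: M noisy measurements of an N-dimensional, K-sparse vector.
N, M, K = 200, 80, 10
A = rng.standard_normal((M, N)) / np.sqrt(M)        # measurement matrix
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = A @ x_true + 0.01 * rng.standard_normal(M)      # y = A x + w

# Classical EM-SBL loop (Tipping-style updates). The paper's contribution, per the
# abstract, is to implement this E-step with VAMP plus variance auto-tuning instead
# of the explicit inversion below.
alpha = np.ones(N)        # per-coefficient prior precisions
beta = 100.0              # noise precision
for _ in range(50):
    # E-step: Gaussian posterior over x given current hyperparameters (O(N^3) inversion).
    Sigma = np.linalg.inv(beta * A.T @ A + np.diag(alpha))
    mu = beta * Sigma @ A.T @ y
    # M-step: re-estimate hyperparameters from the posterior moments.
    gamma = 1.0 - alpha * np.diag(Sigma)
    alpha = gamma / (mu ** 2 + 1e-12)
    beta = (M - gamma.sum()) / (np.linalg.norm(y - A @ mu) ** 2 + 1e-12)

print("NMSE (dB):", 10 * np.log10(np.sum((mu - x_true) ** 2) / np.sum(x_true ** 2)))
```

Running the sketch recovers the support of x_true from far fewer measurements than unknowns, illustrating why SBL is attractive despite the per-iteration inversion cost.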
