Abstract
This paper concerns the sparse Bayesian learning (SBL) problem for group-sparse signals. Group sparsity means that the signal components can be divided into groups, and the entries within a group are simultaneously zero or nonzero. In SBL, each group is controlled by a hyper-parameter. The marginal likelihood maximization (MLM) problem is to maximize the marginal likelihood with respect to one hyper-parameter while fixing all others. The main contribution of this paper is to solve the MLM problem by finding the roots of a polynomial, so the global maximum of the marginal likelihood can be found efficiently. Furthermore, most of the large matrix inverses involved in MLM are replaced by singular value decompositions of much smaller matrices, which substantially reduces the computational complexity. The proposed method differs significantly from the popular expectation-maximization techniques in the literature, where multiple iterations are required for MLM and convergence to the global optimum of the marginal likelihood is not guaranteed.
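To illustrate the root-finding idea in the abstract: if the stationarity condition of the marginal likelihood in a single hyper-parameter reduces to a polynomial equation, all candidate optimizers can be enumerated from the polynomial's roots and the global maximizer selected among them. The sketch below is a hypothetical illustration under that assumption; the function name `candidate_hyperparams` and the example coefficients are not from the paper.

```python
import numpy as np

def candidate_hyperparams(poly_coeffs):
    """Return real, nonnegative roots of the polynomial with the given
    coefficients (highest degree first). In SBL, hyper-parameters are
    variances, so only gamma >= 0 is admissible."""
    roots = np.roots(poly_coeffs)
    real = roots[np.abs(roots.imag) < 1e-10].real
    return [float(r) for r in real if r >= 0]

# Made-up example polynomial: gamma^2 - 3*gamma + 2 = (gamma - 1)(gamma - 2).
# Its nonnegative real roots, 1 and 2, would be the candidate stationary
# points; the global maximizer is found by evaluating the marginal
# likelihood at each candidate (and at the boundary gamma = 0).
cands = sorted(candidate_hyperparams([1.0, -3.0, 2.0]))
print(cands)
```

Because the set of roots is finite and exhaustively checked, this approach avoids the iterative updates of expectation-maximization, whose fixed points need not be global optima.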