Abstract

In many applications, we need to learn the correlation structure of variables that form blocks, and in some challenging genetic data analyses, we must even learn overlapping block structures across multiple groups of variables. Incorporating grouping-information priors into the learning of Gaussian graphical models in a Bayesian framework is an innovative and feasible approach. We first introduce a Normal-Exponential-Gamma (NEG) structural mixture prior for the elements of the precision matrix. By embedding a structural-information prior and a spike-and-slab prior, the Gaussian graphical model induces two levels of shrinkage: the group level and the individual level. Consequently, a deterministic expectation-maximization (EM) method can be used for posterior inference. Second, we add a prior distribution to learn the overlapping block structures, and a variational EM method is proposed for posterior inference. Simulation results show that the proposed method estimates structural sparsity with smaller estimation bias than existing alternative methods. Finally, we use two data sets, stock prices and gene expression, to illustrate the application of our method.
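The NEG prior mentioned above is a heavy-tailed scale mixture of normals: it shrinks small precision-matrix entries strongly toward zero while leaving large entries relatively untouched. The sketch below draws from one common NEG hierarchy by Monte Carlo to illustrate this shape; the specific parameterization (`lam`, `gamma`) is an illustrative assumption, not necessarily the one used in the paper.

```python
import numpy as np

def sample_neg(n, lam=1.0, gamma=1.0, seed=0):
    """Draw n samples from a Normal-Exponential-Gamma scale mixture.

    Illustrative hierarchy (an assumption, not the paper's exact form):
        psi    ~ Gamma(shape=lam, scale=gamma**2)   # top-level mixing weight
        sigma2 ~ Exponential(scale=psi)             # per-element variance
        x      ~ Normal(0, sqrt(sigma2))            # heavy-tailed marginal
    """
    rng = np.random.default_rng(seed)
    psi = rng.gamma(lam, gamma**2, size=n)
    sigma2 = rng.exponential(psi)
    return rng.normal(0.0, np.sqrt(sigma2))

draws = sample_neg(100_000)
# The marginal is symmetric about zero with a sharp peak (aggressive
# shrinkage of small effects) and tails heavier than a Gaussian's
# (little shrinkage of large effects), which is the behavior exploited
# for sparsity at the individual level.
```

A spike-and-slab component, as in the abstract, would additionally place a point mass (or near-point mass) at zero on top of such a continuous slab, giving the second, group-level layer of shrinkage.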
