Abstract

Learning a Gaussian graphical model with latent variables is ill posed when the sample size is insufficient, so the estimation must be appropriately regularized. A common choice is the convex ℓ1-plus-nuclear-norm penalty. However, these additive convex regularizers do not always yield the best estimator, especially when the sample size is small. In this paper, we consider an additive concave regularization that does not require the strong irrepresentable condition. The concave regularization also corrects the intrinsic estimation biases of the Lasso and nuclear-norm penalties. We derive the proximity operators of our concave regularizers, which induce sparsity and low-rankness, respectively. In addition, we extend our method to allow decompositions into fused structured sparsity plus low rank, providing a powerful tool for models with temporal information. Specifically, we develop a nontrivial modified alternating direction method of multipliers with at least local convergence. Finally, we use both synthetic and real data to validate the method. In an application to reconstructing two-stage cancer networks, "the Warburg effect" is revealed directly.
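To illustrate the bias-correction property that motivates a concave penalty, the following is a minimal sketch of the proximity operator of the minimax concave penalty (MCP), a representative member of the concave family; the choice of MCP and the name `prox_mcp` are illustrative assumptions, not the paper's exact penalty.

```python
import numpy as np

def prox_mcp(z, lam, gamma=3.0):
    """Elementwise proximity operator of the minimax concave penalty (MCP),
    i.e. the "firm" thresholding rule (requires gamma > 1).

    Small entries are set to zero, moderate entries are shrunk, and large
    entries (|z| > gamma * lam) are left untouched -- unlike the l1 prox
    (soft thresholding), which shrinks every surviving entry by lam.
    """
    z = np.asarray(z, dtype=float)
    a = np.abs(z)
    shrunk = np.sign(z) * (a - lam) / (1.0 - 1.0 / gamma)
    return np.where(a <= lam, 0.0, np.where(a <= gamma * lam, shrunk, z))
```

Because large entries pass through unchanged, the estimator of strong edges in the graph is not systematically shrunk toward zero, which is the bias correction referred to above.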

Highlights

  • Learning a graphical model from high-dimensional but partial observations is ill posed, admitting infinitely many solutions

  • We design a gradient-based but nonsmooth optimization scheme based on the alternating direction method of multipliers (see the sketch after this list)

  • Before proving the convergence result, we establish a contraction property, which is the key step in the convergence proof of the general alternating direction method of multipliers (ADMM) [29]
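To make the ADMM structure concrete, here is a minimal sketch of the standard two-block ADMM in scaled form; the paper's modified variant, which handles the concave penalties and carries the contraction and local-convergence analysis, is not reproduced here, and the names `admm`, `prox_f`, and `prox_g` are illustrative.

```python
import numpy as np

def admm(prox_f, prox_g, shape, rho=1.0, n_iter=200):
    """Standard two-block ADMM in scaled form for
        minimize f(x) + g(z)  subject to  x = z.

    prox_f(v, t) and prox_g(v, t) evaluate the proximity operators of
    f and g with step size t; here t = 1/rho.
    """
    x = np.zeros(shape)
    z = np.zeros(shape)
    u = np.zeros(shape)                # scaled dual variable
    for _ in range(n_iter):
        x = prox_f(z - u, 1.0 / rho)   # x-update: proximity step on f
        z = prox_g(x + u, 1.0 / rho)   # z-update: proximity step on g
        u = u + x - z                  # dual update on the consensus gap
    return z
```

Each subproblem reduces to a single proximity-operator evaluation, which is why closed-form proximity operators for the concave sparsity and low-rank penalties are established first.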


Summary

Introduction

Learning a graphical model from high-dimensional but partial observations is ill posed, admitting infinitely many solutions. The overall goal of this paper is to develop a computational framework for concave regularization with additive sparse and low-rank constraints [9], motivated by the need to encode latent variables in a Gaussian graphical model when samples are scarce, an important issue for gene interaction networks with latent regulatory factors. Zhang established a bound by imposing an appropriate ℓ2 regularity condition under which a family of column-normalized matrices guarantees desirable estimation under an appropriate sparsity assumption [10], yielding an error bound no worse than that of the Lasso. This general result holds for the entire concave regularization family, including the bridge penalties (ℓq, q < 1). Overall, we have developed a unified computational approach for additive concave regularization.
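For contrast with the concave family, the following minimal sketch (an illustration under stated assumptions, not the paper's code) shows the two convex proximity operators used by the ℓ1-plus-nuclear-norm baseline: elementwise soft thresholding and singular value thresholding. Both shrink every surviving component by a constant, which is exactly the estimation bias the concave penalties correct.

```python
import numpy as np

def soft_threshold(Z, tau):
    """Prox of tau * ||.||_1: every surviving entry is shrunk by tau,
    the intrinsic bias of the Lasso discussed above."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def svt(Z, tau):
    """Prox of tau * nuclear norm: soft-threshold the singular values,
    shrinking each retained singular value by tau."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ (np.maximum(s - tau, 0.0)[:, None] * Vt)
```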

Latent Gaussian Graphical Model with Additive Concave Regularization
Modified Alternating Direction Method of Multipliers
Proximity Operator
Numerical Evaluation
Conclusion
