Abstract

Estimating a large and sparse inverse covariance matrix plays a fundamental role in modern multivariate analysis, because the zero entries capture the conditional independence between pairs of variables given all the other variables. This estimation task can be accomplished by penalizing the maximum likelihood estimation with an adaptive group lasso penalty imposed directly on the elements of the inverse, which allows the estimator to have a blockwise sparse structure that is particularly useful in some applications. In this paper, we are particularly interested in the implementation of optimization algorithms for minimizing a class of log-determinant models. The considered minimization model, on the one hand, contains a large number of popular sparse models as special cases; on the other hand, it poses additional challenges, especially in high-dimensional settings. Instead of targeting the challenging optimization problem directly, we employ the symmetric Gauss–Seidel (sGS) iteration based alternating direction method of multipliers (ADMM) to tackle the 3-block nonsmooth dual program. By choosing an appropriate proximal term, the implemented sGS-ADMM is shown to be equivalent to the 2-block ADMM, so its convergence follows directly from existing theoretical results. Numerical experiments on synthetic and real data sets, including performance comparisons with the directly extended ADMM, demonstrate that the implemented algorithm is effective in estimating large and sparse inverse covariance matrices.
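To make the setting concrete, the penalized maximum likelihood estimation described above can be written, in a common formulation, as the following log-determinant program (here S denotes the sample covariance; the precise grouping G of the entries of Theta and the adaptive weights w_g are placeholders, and the paper's own definitions should be consulted):

```latex
% Penalized log-determinant model with an adaptive group lasso penalty.
% S: sample covariance; G: a grouping of the entries of Theta;
% w_g: adaptive weights (the exact grouping and weights are assumptions here).
\min_{\Theta \succ 0} \; -\log\det\Theta + \langle S, \Theta \rangle
    + \sum_{g \in \mathcal{G}} w_g \,\lVert \Theta_g \rVert_2
```

For intuition only, the sketch below implements the classical 2-block ADMM for the l1-penalized special case (the graphical lasso), which is one of the popular sparse models contained in the class above; it is not the paper's sGS-ADMM for the 3-block group-lasso dual, and the function name and parameters are hypothetical:

```python
import numpy as np

def graphical_lasso_admm(S, rho, sigma=1.0, max_iter=500, tol=1e-6):
    """2-block ADMM sketch for the l1-penalized log-determinant model:
        min_{Theta > 0}  -log det(Theta) + <S, Theta> + rho * ||Theta||_1.
    This is the graphical lasso special case, not the paper's sGS-ADMM."""
    p = S.shape[0]
    Z = np.eye(p)              # sparse copy of Theta (splitting variable)
    U = np.zeros((p, p))       # scaled dual variable for Theta = Z
    for _ in range(max_iter):
        # Theta-update: closed form via eigendecomposition of sigma*(Z - U) - S;
        # solving sigma*t - 1/t = lam gives t = (lam + sqrt(lam^2 + 4*sigma)) / (2*sigma).
        lam, Q = np.linalg.eigh(sigma * (Z - U) - S)
        t = (lam + np.sqrt(lam**2 + 4.0 * sigma)) / (2.0 * sigma)
        Theta = (Q * t) @ Q.T
        # Z-update: elementwise soft-thresholding, the proximal map of the l1 penalty.
        A = Theta + U
        Z_new = np.sign(A) * np.maximum(np.abs(A) - rho / sigma, 0.0)
        # U-update: dual ascent on the splitting constraint Theta = Z.
        U += Theta - Z_new
        if np.linalg.norm(Z_new - Z, "fro") <= tol * max(1.0, np.linalg.norm(Z, "fro")):
            Z = Z_new
            break
        Z = Z_new
    return Z
```

As a usage example, with a data matrix X one would form S = np.cov(X, rowvar=False) and call graphical_lasso_admm(S, rho=0.1); the eigenvalue update handles the log-determinant term in closed form, while the soft-thresholding step produces the exact zeros that encode conditional independence.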
