Abstract
Several methods have recently been proposed for estimating sparse Gaussian graphical models by applying $\ell_{1}$-regularization to the inverse covariance (precision) matrix. Despite recent advances, contemporary applications require even faster methods to handle ill-conditioned, high-dimensional datasets. In this paper, we propose a new method for solving the sparse inverse covariance estimation problem using the alternating minimization algorithm, which effectively acts as a proximal gradient algorithm on the dual problem. Our approach has several advantages: it is faster than state-of-the-art algorithms by many orders of magnitude; its global linear convergence is rigorously established, giving it strong theoretical guarantees; it accommodates additional constraints on pairwise or marginal relationships between feature pairs derived from domain-specific knowledge; and it handles extremely ill-conditioned problems more robustly. Our algorithm is shown to be both more accurate and faster on simulated and real datasets.
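To make the dual viewpoint concrete: the $\ell_{1}$-regularized problem $\min_{\Theta \succ 0}\, \operatorname{tr}(S\Theta) - \log\det\Theta + \rho\|\Theta\|_{1}$ has a dual $\max_{W}\, \log\det W$ subject to the box constraint $|W_{ij} - S_{ij}| \le \rho$, and a proximal (projected) gradient step on this dual alternates a gradient ascent step with a projection onto the box. The sketch below is a minimal illustration of that generic dual projected-gradient scheme, not the paper's algorithm; the function name, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def glasso_dual_projected_gradient(S, rho, step=0.1, iters=200):
    """Projected gradient ascent on the graphical-lasso dual:
        maximize  log det W   s.t.  |W_ij - S_ij| <= rho.
    The primal precision estimate is Theta = W^{-1}.
    (Illustrative sketch only; fixed step size, no line search.)"""
    p = S.shape[0]
    W = S + rho * np.eye(p)                # feasible, positive definite start
    for _ in range(iters):
        grad = np.linalg.inv(W)            # gradient of log det W is W^{-1}
        W = W + step * grad                # ascent step
        W = np.clip(W, S - rho, S + rho)   # project onto the box constraint
        W = (W + W.T) / 2                  # restore exact symmetry
    return np.linalg.inv(W)                # sparse(ish) precision estimate

# Toy usage: estimate a 4x4 precision matrix from synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4))
S = np.cov(X.T)
Theta = glasso_dual_projected_gradient(S, rho=0.1)
```

Pairwise constraints of the kind mentioned above fit naturally here: fixing or bounding an entry of $W$ (equivalently, a marginal covariance) only changes the projection set, which remains a simple elementwise clip.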