Abstract

Solving $l_1$-regularized optimization problems is common in computational biology, signal processing, and machine learning. The $l_1$ regularization is used to obtain sparse minimizers of convex functions. A well-known example is the least absolute shrinkage and selection operator (LASSO) problem, in which the $l_1$ norm regularizes a quadratic function. A multilevel framework is presented for efficiently solving such $l_1$-regularized sparse optimization problems. We exploit the expected sparseness of the solution and create a hierarchy of problems of similar type, which is traversed in order to accelerate the optimization process. This framework is applied to two problems: (1) the sparse inverse covariance estimation problem, and (2) $l_1$-regularized logistic regression. In the first problem, the inverse of an unknown covariance matrix of a multivariate normal distribution is estimated under the assumption that it is sparse. To this end, an $l_1$ regularized log-dete...
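To make the LASSO setting mentioned above concrete, here is a minimal sketch (not the paper's multilevel method) of solving $\min_x \frac{1}{2}\|Ax - b\|_2^2 + \lambda\|x\|_1$ with plain iterative soft-thresholding (ISTA); all variable names and parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(A, b, lam, n_iter=500):
    """Approximately solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 via ISTA."""
    # Step size 1/L, where L is the Lipschitz constant of the smooth part's gradient
    L = np.linalg.norm(A, ord=2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)              # gradient of the quadratic term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Tiny synthetic example: recover a sparse vector from noisy linear measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[[2, 7, 15]] = [1.5, -2.0, 1.0]         # a sparse ground-truth signal
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = ista_lasso(A, b, lam=0.5)
```

The $l_1$ term drives most coordinates of `x_hat` exactly to zero, illustrating why this regularizer is a natural fit for sparsity-exploiting (e.g., multilevel) acceleration schemes.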
