Abstract

We consider the least squares regression problem, penalized with a combination of the \(\ell _{0}\) and squared \(\ell _{2}\) penalty functions (a.k.a. \(\ell _0 \ell _2\) regularization). Recent work shows that the resulting estimators enjoy appealing statistical properties in many high-dimensional settings. However, exact computation of these estimators remains a major challenge. Indeed, modern exact methods, based on mixed integer programming (MIP), face difficulties for problems where the number of features \(p \sim 10^4\). In this work, we present a new exact MIP framework for \(\ell _0\ell _2\)-regularized regression that can scale to \(p \sim 10^7\), achieving speedups of at least 5000x compared to state-of-the-art exact methods. Unlike recent work, which relies on commercial MIP solvers, we design a specialized nonlinear branch-and-bound (BnB) framework by critically exploiting the problem structure. A key distinguishing component of our framework lies in efficiently solving the node relaxations using a specialized first-order method based on coordinate descent (CD). Our CD-based method effectively leverages information across the BnB nodes by using warm starts, active sets, and gradient screening. In addition, we design a novel method for obtaining dual bounds from primal CD solutions, which certifiably works in high dimensions. Experiments on synthetic and real high-dimensional datasets demonstrate that our framework is not only significantly faster than the state of the art, but can also deliver certifiably optimal solutions to statistically challenging instances that cannot be solved with existing methods. We open source the implementation through our toolkit L0BnB.
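To make the setting concrete, the sketch below states the \(\ell _0 \ell _2\) objective and one cyclic coordinate descent sweep with the closed-form hard-thresholding update for each coordinate. This is an illustrative toy, not the paper's optimized L0BnB implementation (which additionally uses warm starts across BnB nodes, active sets, and gradient screening); the function names and the penalty parameters `lam0`, `lam2` are our own notation.

```python
import numpy as np

def l0l2_objective(X, y, beta, lam0, lam2):
    """Least squares loss plus an l0 penalty and a squared l2 penalty."""
    r = y - X @ beta
    return 0.5 * (r @ r) + lam0 * np.count_nonzero(beta) + lam2 * (beta @ beta)

def cd_sweep(X, y, beta, lam0, lam2):
    """One cyclic coordinate descent pass.

    For coordinate j, the 1-D subproblem
        min_b 0.5*||r_j - x_j b||^2 + lam0*1[b != 0] + lam2*b^2
    has a closed form: take the ridge minimizer, then keep it nonzero
    only if the resulting decrease in the smooth part exceeds lam0.
    """
    r = y - X @ beta                      # full residual, updated incrementally
    for j in range(X.shape[1]):
        x_j = X[:, j]
        r += x_j * beta[j]                # partial residual with coordinate j removed
        denom = x_j @ x_j + 2.0 * lam2
        b_ridge = (x_j @ r) / denom       # ridge solution for this coordinate
        # hard-thresholding rule: nonzero only if it beats the l0 cost
        beta[j] = b_ridge if 0.5 * denom * b_ridge**2 > lam0 else 0.0
        r -= x_j * beta[j]
    return beta
```

A few such sweeps from a warm start are the kind of first-order workhorse used inside each node relaxation; the BnB machinery around it (branching, dual bounds from primal CD solutions) is what turns the heuristic solutions into certifiably optimal ones.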
