Abstract
• A Selective Linearization (SLIN) algorithm is introduced to minimize a sum of several convex non-smooth functions.
• SLIN is a fast operator splitting method that guarantees global convergence and a convergence rate without artificial duplication of variables.
• Novel and efficient methods for solving the subproblems of overlapping group Lasso and the doubly regularized support vector machine are introduced.
• Numerical experiments using data from simulation, cancer research, and Amazon reviews demonstrate the efficacy and accuracy of the method.

We consider the problem of minimizing a sum of several convex non-smooth functions and discuss the selective linearization method (SLIN), which iteratively linearizes all but one of the functions and employs simple proximal steps. The algorithm is a form of multiple operator splitting in which the order of processing the partial functions is not fixed, but rather determined in the course of the calculations. SLIN is globally convergent for an arbitrary number of component functions without artificial duplication of variables. We report results from extensive numerical experiments in two statistical learning settings: large-scale overlapping group Lasso and the doubly regularized support vector machine. In each setting, we introduce novel and efficient methods for solving the subproblems. The numerical results demonstrate the efficacy and accuracy of SLIN.
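To make the linearization idea concrete, the following is a simplified sketch of one SLIN-style iteration in our own notation (the symbols $g_i^k$, $\rho$, and the exact subproblem form are illustrative assumptions, not reproduced from the paper). For $F(x)=\sum_{i=1}^{m} f_i(x)$, a single selected component $f_j$ is kept intact, the remaining components are replaced by subgradient linearizations at the current iterate, and a proximal step is taken:
\[
x^{k+1} \in \arg\min_{x} \; \Big\{ f_j(x) + \sum_{i \ne j} \big[ f_i(x^k) + \langle g_i^k,\, x - x^k \rangle \big] + \tfrac{\rho}{2}\,\|x - x^k\|_2^2 \Big\}, \qquad g_i^k \in \partial f_i(x^k),
\]
where $\rho > 0$ is a proximal parameter and the index $j$ of the component left non-linearized is chosen adaptively during the run rather than in a fixed cyclic order.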