Abstract

In this paper, we investigate the sparse group feature selection problem, in which the covariates possess a grouping structure and sparsity is imposed simultaneously at both the feature and group levels. We reformulate the feature-level sparsity constraint as an equivalent weighted $l_1$-norm constraint in the sparse group optimization problem. To solve the reformulated problem, we first propose a weighted thresholding method based on a dynamic programming algorithm. We then improve this method into a weighted thresholding homotopy algorithm using the homotopy technique. We prove that the algorithm converges to an $L$-stationary point of the original problem. Computational experiments on synthetic data show that the proposed algorithm is competitive with several state-of-the-art algorithms.

Highlights

  • In this paper, we are interested in sparse group feature selection, which simultaneously selects important groups as well as important individual variables

  • To solve the reformulated problem, we penalize the side constraint and propose a weighted thresholding method, which is based on a dynamic programming algorithm

  • To improve the performance of the weighted thresholding method, we develop a weighted thresholding homotopy method for the problem based on the homotopy technique

Summary

INTRODUCTION

We are interested in sparse group feature selection, which simultaneously selects important groups as well as important individual variables. Many works have been devoted to group sparse optimization, such as the group Lasso [6], [7], which uses an $l_2$-regularization for each group. This kind of method is incapable of variable selection at the individual level. Based on a new class of group penalties, the group exponential Lasso proposed in [10] allows the penalty to decay exponentially, enabling the selection of individual features. Another approach to problem (1) is the iterative sparse group hard thresholding algorithm proposed in [1], in which $f(x)$ is approximated by a proximal function. To achieve this purpose, we reformulate the $s_f$-sparse individual constraint in (1) via a weighted $l_1$-norm strategy and obtain a new problem that is equivalent to problem (1).
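To make the two-level sparsity constraint concrete, the following is a minimal greedy sketch of sparse group hard thresholding: keep at most a given number of groups (ranked by Euclidean norm) and, within them, at most a given number of individual entries (ranked by magnitude). This is illustrative only; the paper's exact projection is computed by dynamic programming, and the names `groups`, `s_g`, and `s_f` are hypothetical.

```python
import numpy as np

def sparse_group_hard_threshold(x, groups, s_g, s_f):
    """Greedy sketch: keep at most s_g groups and s_f entries overall.

    x      : 1-D array of coefficients
    groups : list of index lists, one per (non-overlapping) group
    s_g    : group-level sparsity budget
    s_f    : feature-level sparsity budget
    """
    x = np.asarray(x, dtype=float)
    # Rank groups by their Euclidean norm and keep the s_g largest.
    norms = [(np.linalg.norm(x[idx]), gi) for gi, idx in enumerate(groups)]
    kept = {gi for _, gi in sorted(norms, reverse=True)[:s_g]}
    mask = np.zeros_like(x, dtype=bool)
    for gi in kept:
        mask[groups[gi]] = True
    y = np.where(mask, x, 0.0)
    # Within the kept groups, retain only the s_f largest-magnitude entries.
    if np.count_nonzero(y) > s_f:
        cutoff = np.sort(np.abs(y))[-s_f]
        y[np.abs(y) < cutoff] = 0.0
    return y
```

The greedy two-stage selection is not guaranteed to be the exact Euclidean projection onto the joint constraint set, which is why the paper relies on a dynamic programming algorithm instead; the sketch only shows what a feasible point of problem (1) looks like.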

EQUIVALENT FORMULATION
DYNAMIC PROGRAMMING ALGORITHM
COMPUTATIONAL EXPERIMENTS
Findings
CONCLUSION