Abstract

Group variable selection is an important issue in high-dimensional data modeling, and most existing methods consider only the linear model. Therefore, a new method based on the deep neural network (DNN), an increasingly popular nonlinear method in both the statistics and deep learning communities, is proposed. The method is applicable to general nonlinear models, including the linear model as a special case. Specifically, a group sparse neural network (GSNN) is designed, in which the definition of nonlinear group high-level features (NGHFs) is generalized to the network structure. A two-stage group sparse (TGS) algorithm is employed to induce group variable selection by performing group structure selection on the network. GSNN is promising for complex nonlinear systems with interactions and correlated predictors, overcoming the shortcomings of linear or marginal variable selection methods. Theoretical results on convergence and group-level selection consistency are also given. Simulation results and real data analysis demonstrate the superiority of our method.
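To make the idea of group-level sparsity in a network concrete, the following is a minimal illustrative sketch, not the paper's GSNN or TGS algorithm: it assumes a group-lasso-style penalty on the first-layer weight columns of a small feed-forward network, so that all connections attached to a group of input variables can be shrunk to zero together. The class name, architecture, and penalty weight are hypothetical choices for illustration only.

```python
# Hypothetical sketch: group-level sparsity via a group-lasso-style penalty
# on the first-layer weights of a feed-forward network (not the authors' TGS).
import torch
import torch.nn as nn

class GroupSparseNet(nn.Module):
    def __init__(self, in_dim, hidden_dim, groups):
        # `groups` is a list of index lists partitioning the input variables.
        super().__init__()
        self.groups = groups
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x):
        return self.net(x)

    def group_penalty(self):
        # Sum over groups of the Frobenius norm of the first-layer weight
        # columns belonging to that group: an input-level group-lasso analogue.
        w = self.net[0].weight  # shape: (hidden_dim, in_dim)
        return sum(torch.norm(w[:, g]) for g in self.groups)

# Usage: add lam * model.group_penalty() to the ordinary fitting loss.
model = GroupSparseNet(in_dim=6, hidden_dim=16, groups=[[0, 1, 2], [3, 4, 5]])
x, y = torch.randn(32, 6), torch.randn(32, 1)
loss = nn.functional.mse_loss(model(x), y) + 0.1 * model.group_penalty()
loss.backward()
```

Because the penalty is applied to whole blocks of first-layer columns, an entire group of predictors can be removed from the fitted network at once, which is the group-selection behavior the abstract describes at a high level.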
