Abstract
For grouped covariates, we propose a boosting framework that allows for sparsity within and between groups. By using component-wise and group-wise gradient ridge boosting simultaneously with adjusted degrees of freedom or penalty parameters, a model with properties similar to the sparse-group lasso can be fitted through boosting. We show that within-group and between-group sparsity can be controlled by a mixing parameter, and discuss similarities to and differences from the mixing parameter in the sparse-group lasso. Furthermore, we show under which conditions variable selection on a group or individual-variable basis occurs, and provide selection bounds for the regularization parameters that depend solely on the singular values of the design matrix in a single iteration of ridge-penalized linear boosting. In special cases, we characterize the probability of selecting an individual variable versus a group of variables through a generalized beta prime distribution. Using simulations as well as two real datasets from ecological and organizational research, we demonstrate the effectiveness and predictive competitiveness of this novel estimator. The results suggest that, in the presence of grouped variables, sparse-group boosting yields less biased variable selection and better predictive performance than component-wise or group-component-wise boosting.
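The core idea, competing ridge base-learners for individual variables against ridge base-learners for whole groups, with degrees of freedom split by a mixing parameter, can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes individual learners get `df = alpha` and group learners get `df = 1 - alpha`, and that the learner with the smallest residual sum of squares wins each iteration. All function names (`ridge_base_learner`, `sparse_group_boost`) are hypothetical.

```python
import numpy as np

def ridge_base_learner(X, r, df):
    """Fit a ridge base-learner to residual r, choosing the penalty lambda
    (by geometric bisection) so that the effective degrees of freedom,
    trace(X (X'X + lam I)^-1 X') = sum d_i^2 / (d_i^2 + lam), equal df."""
    d = np.linalg.svd(X, compute_uv=False)
    lo, hi = 1e-12, 1e12
    for _ in range(200):
        lam = np.sqrt(lo * hi)
        if np.sum(d**2 / (d**2 + lam)) > df:
            lo = lam  # df too large -> penalize more
        else:
            hi = lam
    beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ r)
    rss = np.sum((r - X @ beta) ** 2)
    return beta, rss

def sparse_group_boost(X, y, groups, alpha=0.5, nu=0.1, steps=100):
    """Toy sparse-group boosting: each iteration, one ridge learner per
    individual variable (df = alpha) competes with one per group
    (df = 1 - alpha); the best learner is added with step length nu."""
    n, p = X.shape
    intercept = y.mean()
    beta = np.zeros(p)
    r = y - intercept
    candidates = [[j] for j in range(p)]                      # individual
    candidates += [[j for j in range(p) if groups[j] == g]    # group
                   for g in sorted(set(groups))]
    for _ in range(steps):
        best = None
        for idx in candidates:
            df = alpha if len(idx) == 1 else 1 - alpha
            b, rss = ridge_base_learner(X[:, idx], r, df)
            if best is None or rss < best[0]:
                best = (rss, idx, b)
        _, idx, b = best
        beta[idx] += nu * b
        r = y - intercept - X @ beta
    return intercept, beta
```

Because both candidate types are shrunken to comparable degrees of freedom, `alpha` near 1 favors individual (within-group sparse) selection while `alpha` near 0 favors group selection, mirroring the role of the mixing parameter described above.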