Abstract
The elastic net is a regularized least squares regression method that has been widely used in learning and variable selection. Elastic net regularization linearly combines an l1 penalty term (like the lasso) and an l2 penalty term (like ridge regression). The l1 penalty term enforces sparsity of the elastic net estimator, whereas the l2 penalty term ensures democracy among groups of correlated variables. Compressed sensing is an extensively studied technique for efficiently reconstructing a sparse vector from far fewer samples/observations than its ambient dimension. In this paper we study the elastic net in the setting of sparse vector recovery. We prove that, when the elastic net is used to recover sparse vectors from few observations, the resulting estimator is stable provided that the underlying measurement/design matrix satisfies the commonly required restricted isometry property or the sparse approximation property. It is well known that many independent random measurement matrices satisfy the restricted isometry property, while random measurement matrices generated by highly correlated Gaussian random variables satisfy the sparse approximation property. As a byproduct, we establish a uniform bound for the grouping effect of the elastic net. Numerical experiments are provided to illustrate our theoretical results on the stability and grouping effect of the elastic net estimator.
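The sparse-recovery setting described above can be sketched in code. The snippet below is an illustrative example, not the paper's experiment: it draws a hypothetical Gaussian measurement matrix, generates noisy observations of an s-sparse vector, and recovers it with scikit-learn's `ElasticNet`, whose objective combines the l1 and l2 penalties as in the abstract. All dimensions and penalty parameters are assumed values chosen for illustration.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                     # observations, dimension, sparsity (illustrative)
X = rng.standard_normal((n, p))           # random Gaussian measurement/design matrix
beta = np.zeros(p)
beta[:s] = 1.0                            # s-sparse ground-truth vector
y = X @ beta + 0.01 * rng.standard_normal(n)  # few noisy observations (n < p)

# scikit-learn's ElasticNet minimizes
#   (1/2n)||y - Xb||_2^2 + alpha*l1_ratio*||b||_1 + (alpha/2)*(1 - l1_ratio)*||b||_2^2,
# i.e. a linear combination of lasso-type and ridge-type penalties.
model = ElasticNet(alpha=0.01, l1_ratio=0.9, max_iter=10_000)
model.fit(X, y)

err = np.linalg.norm(model.coef_ - beta)  # recovery error of the elastic net estimator
```

With Gaussian measurements of this kind (which satisfy the restricted isometry property with high probability), the estimator recovers the sparse vector up to a small, stable error, consistent with the stability results stated above.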