Abstract

In a linear regression model, the Dantzig selector (Candès and Tao, 2007) minimizes the L1 norm of the regression coefficients subject to a bound λ on the L∞ norm of the covariances between the predictors and the residuals; the resulting estimator is the solution of a linear program, which may be nonunique or unstable. We propose a regularized alternative to the Dantzig selector. These estimators (which depend on λ and an additional tuning parameter r) minimize objective functions that are the sum of the L1 norm of the regression coefficients plus r times the logarithmic potential function of the Dantzig selector constraints, and can be viewed as penalized analytic centers of the latter constraints. The tuning parameter r controls the smoothness of the estimators as functions of λ and, when λ is sufficiently large, the estimators depend approximately on r and λ via r/λ².
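The objective described above can be sketched numerically. The following is a minimal illustration, not the authors' algorithm: it forms the constraints −λ ≤ xⱼᵀ(y − Xβ) ≤ λ, adds the logarithmic barrier −Σⱼ [log(λ − cⱼ) + log(λ + cⱼ)] scaled by r to the L1 norm of β, and minimizes with a generic derivative-free solver from SciPy; all data, parameter values, and variable names are hypothetical choices for the demonstration.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic sparse regression problem (illustrative data only)
rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam = 5.0  # bound on the L-infinity norm of X^T (y - X beta)
r = 0.5    # weight on the logarithmic potential (smoothing parameter)

def objective(beta):
    c = X.T @ (y - X @ beta)          # covariances between predictors and residuals
    slack_lo, slack_hi = lam + c, lam - c
    if np.any(slack_lo <= 0) or np.any(slack_hi <= 0):
        return np.inf                  # outside the Dantzig selector constraints
    barrier = -np.sum(np.log(slack_lo)) - np.sum(np.log(slack_hi))
    return np.sum(np.abs(beta)) + r * barrier

# Least-squares fit is strictly feasible: its residual covariances are zero
beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
res = minimize(objective, beta0, method="Nelder-Mead",
               options={"xatol": 1e-9, "fatol": 1e-9, "maxiter": 20000})
beta_hat = res.x
print(np.round(beta_hat, 3))
print(np.max(np.abs(X.T @ (y - X @ beta_hat))))  # stays strictly below lam
```

Because the barrier is finite only inside the constraint region, the minimizer of this smooth(er) objective lies strictly inside the Dantzig selector's feasible set; as r → 0 with λ fixed, the penalized analytic center approaches a solution of the original linear program.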
