Abstract

There has been much recent work on inference after model selection in situations where the noise level is known. However, the error variance is rarely known in practice and its estimation is difficult in high-dimensional settings. In this work we propose using the square-root lasso, also known as the scaled lasso, to perform inference for selected coefficients and the noise level simultaneously. The square-root lasso has the property that the choice of a reasonable tuning parameter does not depend on the noise level in the data. We provide valid $p$-values and confidence intervals for coefficients after variable selection and estimates for the model-specific variance. Our estimators perform better in simulations than other estimators of the noise variance. These results make inference after model selection significantly more applicable.
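For context, the square-root lasso estimator is usually written (in standard notation assumed here, not quoted from the abstract; $y$ is the response, $X$ the design matrix, $n$ the sample size and $\lambda$ the tuning parameter) as

$$\hat{\beta} = \arg\min_{\beta} \; \frac{\lVert y - X\beta \rVert_2}{\sqrt{n}} + \lambda \lVert \beta \rVert_1, \qquad \hat{\sigma} = \frac{\lVert y - X\hat{\beta} \rVert_2}{\sqrt{n}}.$$

Informally, because the loss is the root mean squared residual rather than its square, rescaling the noise rescales both terms of the objective together, which is why a reasonable $\lambda$ can be chosen without knowledge of the error variance; the second expression is the associated estimate of the noise level.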
