Abstract

Inference after variable selection is an important problem. This paper derives the asymptotic distribution of many variable selection estimators, such as forward selection and backward elimination, when the number of predictors is fixed. Under strong regularity conditions the variable selection estimators are asymptotically normal, but in general the asymptotic distribution is a nonnormal mixture distribution. The theory shows that the lasso variable selection and elastic net variable selection estimators are consistent estimators of the coefficient vector β when lasso and elastic net are consistent estimators of β. A bootstrap technique to eliminate selection bias is to fit the variable selection estimator to a bootstrap sample to find a submodel, then draw another bootstrap sample and fit the same submodel to get the bootstrap estimator. Bootstrap confidence regions are then used for hypothesis testing.
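The mixture result can be written schematically as below. This is a hedged sketch in notation assumed here (I_j indexes the candidate submodels, π_j is the limiting probability that the method selects I_j, and V_{j,0} is the covariance of the zero-padded submodel estimator); it is not a verbatim statement of the paper's theorem.

```latex
% Schematic limiting distribution of a variable selection estimator
% (notation assumed here, not quoted from the paper).
\[
  \sqrt{n}\,\bigl(\hat{\beta}_{VS} - \beta\bigr) \xrightarrow{\;d\;} u,
  \qquad
  u = u_j \ \text{with probability } \pi_j,
  \qquad
  u_j \sim N_p\!\bigl(0, V_{j,0}\bigr).
\]
% The limit u is multivariate normal only when a single pi_j equals 1
% (or the u_j coincide); otherwise u is a nonnormal mixture over the
% submodels the selection method can choose.
```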

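The two-bootstrap-sample idea can be sketched as follows. This is a minimal illustration, assuming lasso (scikit-learn's LassoCV) as the variable selection step and percentile intervals as the confidence summary; the function name mix_bootstrap and all tuning choices are illustrative assumptions, not the paper's exact procedure or confidence-region construction.

```python
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression

rng = np.random.default_rng(0)

def mix_bootstrap(X, y, B=200):
    """For each replication: select a submodel on one bootstrap sample,
    then refit that same submodel by OLS on a second, independent
    bootstrap sample, recording the zero-padded coefficient vector."""
    n, p = X.shape
    boot_betas = np.zeros((B, p))
    for b in range(B):
        # Bootstrap sample 1: run the variable selection step (lasso here).
        i1 = rng.integers(0, n, n)
        active = np.flatnonzero(LassoCV(cv=5).fit(X[i1], y[i1]).coef_)
        if active.size == 0:
            continue  # empty submodel: leave this replication at zero
        # Bootstrap sample 2: fit the *same* submodel, so the data used for
        # estimation are independent of the data used for selection.
        i2 = rng.integers(0, n, n)
        ols = LinearRegression().fit(X[i2][:, active], y[i2])
        boot_betas[b, active] = ols.coef_
    return boot_betas

# Toy usage: percentile intervals for each coefficient.
X = rng.normal(size=(100, 5))
y = X @ np.array([2.0, 0.0, 0.0, 1.0, 0.0]) + rng.normal(size=100)
lo, hi = np.percentile(mix_bootstrap(X, y), [2.5, 97.5], axis=0)
print(np.column_stack([lo, hi]))
```

Refitting on a second bootstrap sample is what breaks the dependence between the selection step and the coefficient estimates, which is the source of the selection bias the abstract refers to.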