Abstract

We consider a two-stage estimation method for linear regression, which we call lassoed boosting. First, it screens variables with the lasso of Tibshirani (1996); second, it re-estimates the coefficients with the least-squares boosting method of Friedman (2001) on each set of selected variables. In a large-scale simulation experiment patterned after Hastie, Tibshirani, and Tibshirani (2020), lassoed boosting performs as well as the relaxed lasso of Meinshausen (2007) and, in certain scenarios, yields a sparser model. Applied to predicting equity returns, lassoed boosting gives the smallest mean-squared prediction error among the competing methods considered.
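For readers who want the mechanics, the sketch below illustrates the two-stage idea in Python. It uses scikit-learn's lasso_path to generate candidate active sets for the screening stage and a hand-rolled componentwise least-squares boosting routine (one common form of least-squares boosting for linear models) for the re-estimation stage. The step size eps, the number of boosting steps, the validation split, and the rule for picking among active sets are illustrative assumptions, not the paper's exact specification.

    import numpy as np
    from sklearn.linear_model import lasso_path
    from sklearn.model_selection import train_test_split

    def ls_boost(X, y, n_steps=500, eps=0.01):
        """Componentwise least-squares boosting: repeatedly fit the single
        predictor that best explains the current residual and take a small
        step (eps) toward that one-variable least-squares fit."""
        beta = np.zeros(X.shape[1])
        intercept = y.mean()
        r = y - intercept
        for _ in range(n_steps):
            scores = X.T @ r                       # alignment with residual
            j = int(np.argmax(np.abs(scores)))     # best single predictor
            step = eps * scores[j] / (X[:, j] @ X[:, j])
            beta[j] += step
            r -= step * X[:, j]
        return intercept, beta

    # Toy data with a sparse signal (assumed setup, not the paper's design).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))
    true_beta = np.zeros(50)
    true_beta[:5] = 1.0
    y = X @ true_beta + 0.5 * rng.standard_normal(200)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

    # Stage 1: lasso screening -- collect the distinct active sets along
    # the lasso path (center first, since lasso_path fits no intercept).
    Xc = X_tr - X_tr.mean(axis=0)
    yc = y_tr - y_tr.mean()
    _, coefs, _ = lasso_path(Xc, yc)
    active_sets = {tuple(np.flatnonzero(c)) for c in coefs.T if np.any(c)}

    # Stage 2: re-estimate each active set with least-squares boosting and
    # keep the candidate with the smallest validation MSE.
    best_mse, best_set = np.inf, None
    for sel in active_sets:
        idx = list(sel)
        a, b = ls_boost(X_tr[:, idx], y_tr)
        mse = np.mean((y_va - (a + X_va[:, idx] @ b)) ** 2)
        if mse < best_mse:
            best_mse, best_set = mse, idx

    print(f"selected {len(best_set)} variables, validation MSE = {best_mse:.3f}")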
