Abstract

When outliers or heavy-tailed errors are present in linear models, least absolute deviation (LAD) regression is a robust alternative to ordinary least squares regression. Existing variable-selection methods in linear models based on LAD regression either consider only a fixed, finite number of predictors or lack the oracle property for the resulting estimator. In this article, we focus on variable selection via LAD regression with a diverging number of parameters. The rate of convergence of the LAD estimator with the smoothly clipped absolute deviation (SCAD) penalty is established. Furthermore, we demonstrate that, under certain regularity conditions, the penalized estimator with a properly selected tuning parameter enjoys the oracle property. In addition, the rank correlation screening method originally proposed by Li et al. (2011) is applied to handle ultrahigh-dimensional data. Simulation studies are conducted to examine the finite-sample performance of the estimator, and we further illustrate the proposed methodology with a real-data example.
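For concreteness, a standard formulation of the SCAD-penalized LAD objective, following Fan and Li (2001) rather than the article itself (the notation $y_i$, $x_i$, $p_n$, $\lambda_n$ below is illustrative, not taken from the abstract), is

\[
\hat{\beta} = \operatorname*{arg\,min}_{\beta} \; \sum_{i=1}^{n} \bigl| y_i - x_i^{\top} \beta \bigr| \;+\; n \sum_{j=1}^{p_n} p_{\lambda_n}\!\bigl(|\beta_j|\bigr),
\]

where $p_n$ is the (diverging) number of predictors and the SCAD penalty $p_{\lambda}$ is defined through its derivative

\[
p'_{\lambda}(t) = \lambda \left\{ I(t \le \lambda) + \frac{(a\lambda - t)_{+}}{(a-1)\lambda} \, I(t > \lambda) \right\}, \qquad a > 2,
\]

with $a = 3.7$ the conventional choice. Unlike the lasso, the SCAD penalty tapers off for large coefficients, which is what makes the oracle property attainable.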
