Abstract

Huang et al. [J. Huang, S. Ma, and C.-H. Zhang, Adaptive Lasso for sparse high-dimensional regression models, Statist. Sinica 18 (2008), pp. 1603–1618] studied the asymptotic properties of the adaptive Lasso estimators in sparse, high-dimensional linear regression models in which the number of covariates may increase with the sample size. They proved that, under appropriate conditions, the adaptive Lasso has an oracle property in the sense of Fan and Li [J. Fan and R. Li, Variable selection via nonconcave penalized likelihood and its oracle properties, J. Am. Statist. Assoc. 96 (2001), pp. 1348–1360] and Fan and Peng [J. Fan and H. Peng, Nonconcave penalized likelihood with a diverging number of parameters, Ann. Statist. 32 (2004), pp. 928–961]. In particular, they assumed that the errors of the linear regression model have Gaussian tails. In this paper, we relax this condition and assume only that the errors have a finite 2k-th moment for some integer k > 0. Under this weaker assumption, we prove that the adaptive Lasso still has the oracle property under appropriate conditions. Simulations are carried out to illustrate our result.
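
To make the setting concrete, the following is a minimal simulation sketch (not the authors' code) of the adaptive Lasso under heavy-tailed errors with a finite 2k-th moment, using the standard column-rescaling implementation of the weighted L1 penalty. The design, true coefficient values, error distribution (Student's t with 5 degrees of freedom, so k = 2), and tuning parameters are all illustrative assumptions.

```python
# Minimal adaptive-Lasso sketch with heavy-tailed errors (illustrative, not the paper's setup).
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p, s = 200, 50, 5                          # sample size, covariates, nonzero coefficients
beta = np.zeros(p)
beta[:s] = [3.0, 1.5, -2.0, 2.5, -1.0]        # sparse true coefficient vector (assumed values)
X = rng.standard_normal((n, p))
y = X @ beta + rng.standard_t(df=5, size=n)   # t(5) errors: finite moments up to order 4 (k = 2)

# Step 1: initial estimator (OLS here since p < n; an initial Lasso fit is common when p > n).
beta_init = LinearRegression(fit_intercept=False).fit(X, y).coef_

# Step 2: weighted Lasso, implemented by rescaling each column by |beta_init_j|^gamma,
# so an ordinary Lasso on the rescaled design applies penalty weights 1 / |beta_init_j|^gamma.
gamma, lam = 1.0, 0.1
w = np.abs(beta_init) ** gamma + 1e-8         # small offset avoids division-by-zero weights
Xw = X * w                                    # column-wise rescaling of the design
fit = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(Xw, y)
beta_adaptive = fit.coef_ * w                 # map coefficients back to the original scale

print("selected covariates:", np.nonzero(beta_adaptive)[0])  # ideally the first s indices
```

In repeated simulations of this kind, the selected support and the estimates on that support can be compared with those of the oracle least-squares fit that knows the true nonzero set, which is the behaviour the oracle property describes.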
