Abstract

The penalized estimation principle is fundamental to high-dimensional problems. In the literature, it has been extensively and successfully applied to various models that contain only structural parameters. In contrast, in this paper we first apply this penalization principle to a linear regression model with a finite-dimensional vector of structural parameters and a high-dimensional vector of sparse incidental parameters. For the estimators of the structural parameters, we derive consistency and asymptotic normality, which reveal an oracle property. However, the penalized estimators of the incidental parameters possess only partial selection consistency and are not consistent. This is an interesting partial consistency phenomenon: the structural parameters are consistently estimated while the incidental ones are not. For the structural parameters, we also consider an alternative two-step penalized estimator, which has fewer possible asymptotic distributions and is thus more suitable for statistical inference. We further extend the methods and results to the case where the dimension of the structural parameter vector diverges with, but more slowly than, the sample size. A data-driven approach for selecting the penalty regularization parameter is provided. The finite-sample performance of the penalized estimators of the structural parameters is evaluated through simulations, and a real data set is analyzed.
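
To make the setting concrete, here is a minimal Python sketch of one common formulation of a regression model with structural and sparse incidental parameters. The model y_i = x_i'beta + mu_i + eps_i, the alternating-minimization solver, and the choice of the penalty level lam are illustrative assumptions made for this sketch, not the paper's exact model or procedure.

```python
import numpy as np

def soft_threshold(z, lam):
    """Elementwise soft-thresholding (proximal operator of the L1 penalty)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def penalized_fit(y, X, lam, n_iter=200, tol=1e-8):
    """Alternating minimization of
        (1/2) * ||y - X beta - mu||^2 + lam * ||mu||_1,
    where beta is the structural parameter vector and mu the sparse
    incidental-parameter vector (one entry per observation).
    Assumed formulation for illustration only."""
    n, _ = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    mu = np.zeros(n)
    for _ in range(n_iter):
        beta_old = beta.copy()
        # mu-step: for fixed beta the problem separates across observations
        mu = soft_threshold(y - X @ beta, lam)
        # beta-step: ordinary least squares on the partial residuals
        beta = np.linalg.lstsq(X, y - mu, rcond=None)[0]
        if np.max(np.abs(beta - beta_old)) < tol:
            break
    return beta, mu

def two_step_fit(y, X, lam):
    """Two-step idea: drop observations flagged with a nonzero incidental
    parameter, then refit beta by OLS on the remaining observations."""
    _, mu = penalized_fit(y, X, lam)
    keep = mu == 0.0
    return np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]

# Small simulated example
rng = np.random.default_rng(0)
n, p, n_nonzero = 200, 3, 10
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, -2.0, 0.5])
mu_true = np.zeros(n)
mu_true[:n_nonzero] = 5.0            # sparse incidental parameters
y = X @ beta_true + mu_true + rng.standard_normal(n)

lam = 2.0 * np.sqrt(np.log(n))       # illustrative choice, not the paper's data-driven rule
beta_pen, mu_hat = penalized_fit(y, X, lam)
beta_two = two_step_fit(y, X, lam)
print("penalized beta:", np.round(beta_pen, 3))
print("two-step  beta:", np.round(beta_two, 3))
print("nonzero mu_i detected:", int(np.sum(mu_hat != 0)))
```

The two-step function above mirrors, in spirit, the alternative two-step penalized estimator mentioned in the abstract: the first stage screens observations with nonzero estimated incidental parameters, and the second stage re-estimates the structural parameters on the remaining observations.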
