Abstract

In partially linear models with random coefficient autoregressive errors, we consider methodology for simultaneous model selection and parameter estimation using lasso and shrinkage strategies. We provide natural adaptive estimators that significantly improve upon the classical procedures when some of the predictors are nuisance variables that may or may not affect the association between the response and the main predictors. In the context of two competing partially linear regression models (the full model and a submodel), we consider an adaptive shrinkage estimation strategy and propose the shrinkage estimator and the positive‐rule shrinkage estimator. We develop the properties of these estimators using the notion of asymptotic distributional risk, and the shrinkage estimators are shown to be more efficient than the classical estimators for a wide class of models. For the lasso‐type estimation strategy, we devise efficient algorithms to obtain numerical results, and we compare the performance of lasso with that of the shrinkage and classical estimators. Monte Carlo simulation experiments are conducted for various combinations of the nuisance parameters and sample size, and the performance of each method is evaluated in terms of simulated mean squared error. The comparison reveals that the lasso and shrinkage strategies outperform the classical procedure and perform comparably to each other; the shrinkage estimators outperform the lasso strategy in the effective part of the parameter space when, and only when, there are many nuisance variables in the model. A data example is presented to illustrate the usefulness of the suggested methods. Copyright © 2011 John Wiley & Sons, Ltd.
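For concreteness, a standard Stein-type construction of this kind is sketched below; the notation (the full-model and submodel estimators \(\hat\beta_{FM}\), \(\hat\beta_{SM}\), the test statistic \(T_n\), and the number of nuisance coefficients \(p_2\)) is illustrative and not taken from the paper. The shrinkage estimator pulls the full-model estimator toward the submodel estimator, and the positive-rule version truncates the shrinkage factor to avoid over-shrinking:

\[
\hat\beta^{S} = \hat\beta_{SM} + \left(1 - \frac{p_2 - 2}{T_n}\right)\left(\hat\beta_{FM} - \hat\beta_{SM}\right),
\qquad
\hat\beta^{S+} = \hat\beta_{SM} + \left(1 - \frac{p_2 - 2}{T_n}\right)^{+}\left(\hat\beta_{FM} - \hat\beta_{SM}\right),
\]

where \(T_n\) is a test statistic for the hypothesis that the \(p_2\) nuisance coefficients are zero and \((\cdot)^{+}\) denotes the positive part, so the positive-rule estimator coincides with the submodel estimator whenever \(T_n \le p_2 - 2\).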
