Abstract

Varying coefficient models are useful generalizations of parametric linear models. They allow for parameters that depend on a covariate or that develop over time, and they have a wide range of applications in time series analysis and regression. In time series analysis they have proved to be a powerful approach for inference on behavioral and structural changes over time. In this paper, we are concerned with high-dimensional varying coefficient models, including the time varying coefficient model. Most studies of high-dimensional nonparametric models treat penalization of series estimators. On the other hand, kernel smoothing is a well-established, well-understood and successful approach in nonparametric estimation, in particular for the time varying coefficient model, but little has been done for kernel smoothing in high-dimensional models. In this paper we close this gap by developing a penalized kernel smoothing approach for sparse high-dimensional models. The proposed estimators use a novel penalization scheme that works with kernel smoothing. We establish a general and systematic theoretical analysis in high dimensions. This complements recent alternative approaches that are based on basis approximations and that allow more direct arguments for carrying over insights from high-dimensional linear models. Furthermore, we develop theory not only for regression with independent observations but also for locally stationary time series in high-dimensional sparse varying coefficient models. The development of theory for locally stationary processes in a high-dimensional setting creates technical challenges. We also address issues of numerical implementation and of data-adaptive selection of the tuning parameters for penalization. The finite sample performance of the proposed methods is studied by simulations and illustrated by an empirical analysis of NASDAQ composite index data.
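The two ingredients the abstract combines can be sketched in a few lines of Python: kernel smoothing of the coefficient functions in a varying coefficient model Y_i = Σ_j m_j(Z_i) X_ij + ε_i, and a group-type shrinkage that sets whole estimated functions to zero (sparsity in the functional, L2 sense). This is a minimal illustration under assumed names and tuning choices, not the paper's actual estimator or penalization scheme, which couples the two steps inside one criterion.

```python
import numpy as np

def local_linear_vc(Y, X, Z, z0, h):
    """Local linear kernel estimate of the coefficient functions m_j(z0)
    in the varying coefficient model Y_i = sum_j m_j(Z_i) X_ij + eps_i."""
    n, p = X.shape
    u = (Z - z0) / h
    w = np.exp(-0.5 * u ** 2)                  # Gaussian kernel weights K_h(Z_i - z0)
    D = np.hstack([X, X * (Z - z0)[:, None]])  # local linear design [X, X*(Z - z0)]
    A = D.T @ (w[:, None] * D)
    b = D.T @ (w * Y)
    coef = np.linalg.solve(A + 1e-8 * np.eye(2 * p), b)
    return coef[:p]                            # intercept part = m_hat_j(z0)

def functional_soft_threshold(M, lam):
    """Shrink whole estimated coefficient functions (rows of M, each a
    function evaluated on a grid) to zero when their empirical L2 norm
    falls below lam: a crude stand-in for functional-level sparsity."""
    norms = np.sqrt((M ** 2).mean(axis=1))
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return M * scale[:, None]
```

Applying `local_linear_vc` on a grid of z-values and then `functional_soft_threshold` to the resulting p-by-grid matrix removes covariates whose estimated coefficient function is uniformly small, which is the behavior a functional sparsity penalty is designed to produce.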

Highlights

  • Kernel smoothing has so far been considered mostly for finite-dimensional models

  • This paper closes a gap in the recent literature on sparse high-dimensional nonparametric regression

  • We develop a penalized estimation method based on kernel smoothing


Summary

Model and methodology

We suppose that the data {(Xi, Zi, Yi), 1 ≤ i ≤ n} are generated under the varying coefficient model. We consider estimation of m0j and of its derivative (m0j)^(1), 1 ≤ j ≤ p, for sparse high-dimensional varying coefficient models, where sparsity is defined on a functional level (in the L2 sense). Although we propose the BIC in (2.5) primarily for selecting λ2 in our penalization (2.3), the idea also applies to direct selection of the index sets V and I of varying and non-varying coefficient functions. This is done by minimizing the following BIC: BIC(V, I) = log L(m̂_{V,I}, m̂_{V,I}^{(1)}) + C_n, where the minimum runs over subsets V and I of {1, …, p}. We show consistency of the proposed BICs in (2.5) and (2.6) in Section 3.4.
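The idea of selecting index sets by minimizing a BIC can be sketched as follows. This is an illustrative simplification, not the paper's criterion (2.5)/(2.6): it selects a single set of active covariates rather than the pair (V, I), uses a local constant kernel fit, replaces the log-likelihood by n·log(RSS/n), and takes C_n = 2·log(n) per selected component as an assumed penalty constant. All function names are hypothetical.

```python
import numpy as np
from itertools import combinations

def vc_rss(Y, X, Z, cols, h):
    """Residual sum of squares of a kernel-weighted (local constant)
    varying coefficient fit using only the covariates indexed by cols."""
    n = len(Y)
    Xs = X[:, cols]
    fitted = np.empty(n)
    for i in range(n):
        w = np.exp(-0.5 * ((Z - Z[i]) / h) ** 2)   # Gaussian kernel weights at Z_i
        A = Xs.T @ (w[:, None] * Xs)
        b = Xs.T @ (w * Y)
        beta = np.linalg.solve(A + 1e-8 * np.eye(len(cols)), b)
        fitted[i] = Xs[i] @ beta
    return float(np.sum((Y - fitted) ** 2))

def bic_select(Y, X, Z, h=0.2, Cn=None):
    """Minimize the BIC-type criterion n*log(RSS/n) + C_n*|S| over all
    non-empty covariate subsets S; C_n = 2*log(n) is an assumed choice."""
    n, p = X.shape
    if Cn is None:
        Cn = 2.0 * np.log(n)
    best, best_bic = None, np.inf
    for k in range(1, p + 1):
        for cols in combinations(range(p), k):
            bic = n * np.log(vc_rss(Y, X, Z, list(cols), h) / n) + Cn * k
            if bic < best_bic:
                best, best_bic = set(cols), bic
    return best
```

Exhaustive enumeration of subsets is only feasible for small p; in a genuinely high-dimensional setting one would instead apply the criterion along a penalization path, which is how the tuning parameter λ2 is selected in the paper.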

Oracle inequality
Consistency and inconsistency of group LASSO estimators
Oracle properties
Consistent identification of BIC
Numerical implementation
Model identification and estimation of penalized methods
Consistency of BIC in semiparametric model identification
A data example
Conclusion

