Abstract

We study the problem of multivariate estimation in the nonparametric regression model with random design. We assume that the regression function to be estimated possesses a partially linear structure, where the parametric and nonparametric components are both unknown. Based on the Goldenshluger and Lepski methodology, we propose an estimation procedure that adapts to the smoothness of the nonparametric component by selecting from a family of specific kernel estimators. We establish a global oracle inequality (under the $L_p$-norm, $1 \leq p < \infty$) and examine its performance over anisotropic Hölder spaces.

Highlights

  • We observe $(X_1, Y_1), \ldots, (X_n, Y_n) \in \mathbb{R}^d \times \mathbb{R}$ satisfying $Y_i = g(X_i) + \zeta_i$, $i = 1, \ldots, n$ (1), where $d \geq 2$, $g$ is an unknown function from $[0, 1]^d$ to $\mathbb{R}$, the design points $\{X_i\}_{i=1}^n$ are i.i.d. random variables uniformly distributed on $[0, 1]^d$, and the noise variables $\{\zeta_i\}_{i=1}^n$ are i.i.d. centered real random variables having a symmetric distribution (a simulation sketch follows this list)

  • We study the problem of multivariate estimation in the nonparametric regression model with random design

  • We assume that the regression function to be estimated possesses a partially linear structure, where the parametric and nonparametric components are both unknown
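For concreteness, here is a minimal simulation sketch of the observation model (1) under a hypothetical partially linear regression function $g(x) = \theta^\top x^{(1)} + f(x^{(2)})$; the specific choices of $\theta$, $f$, the dimensions, and the noise law below are illustrative assumptions, not taken from the paper.

```python
# Sketch: simulate observations from model (1) with a hypothetical
# partially linear g(x) = theta^T x_(1) + f(x_(2)); theta, f, and the
# Laplace noise are illustrative assumptions, not the paper's choices.
import numpy as np

rng = np.random.default_rng(0)
n, d1, d2 = 1000, 2, 2          # sample size and component dimensions
d = d1 + d2                     # total dimension, d >= 2

# i.i.d. design points, uniformly distributed on [0, 1]^d
X = rng.uniform(0.0, 1.0, size=(n, d))

theta = np.array([1.5, -0.7])   # unknown linear (parametric) component

def f(u):
    # unknown nonparametric component on [0, 1]^{d2}
    return np.sin(2.0 * np.pi * u[:, 0]) * u[:, 1]

g = X[:, :d1] @ theta + f(X[:, d1:])

# i.i.d. centered noise with a symmetric distribution (here: Laplace)
zeta = rng.laplace(loc=0.0, scale=0.3, size=n)

Y = g + zeta                    # observed responses (X_i, Y_i)
```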



Introduction

We establish a global oracle inequality and show how to use it to derive minimax adaptive results when $f$ belongs to an anisotropic Hölder space. Let $L$ and $K$ be two kernel functions defined on $\mathbb{R}^{d_1}$ and $\mathbb{R}^{d_2}$ respectively, taking values in $\mathbb{R}$ and satisfying the following assumptions. Our selection rule uses auxiliary estimators that are constructed as follows: for $h, \eta \in \mathcal{H}_n$, define the kernel $K_h * K_\eta$. Let $\tau = (\tau_1, \ldots, \tau_{d_2})$, where $\tau_i = \log^{d_2}(n)$, and consider the following notation: $f_\infty = \|f_\tau\|_\infty + 5$, where $\|f_\tau\|_\infty = \sup_x |f_\tau(x)|$.
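For orientation, the display below gives a schematic Goldenshluger–Lepski-type selection rule consistent with the auxiliary construction above; the majorant $\Psi_n$ and the exact form of the comparison are illustrative assumptions, not the paper's precise rule.

\[
\widehat{R}_n(h) \;=\; \max_{\eta \in \mathcal{H}_n}
\Big[\, \big\| \widehat{f}_{h,\eta} - \widehat{f}_{\eta} \big\|_p \;-\; \Psi_n(\eta) \Big]_+ ,
\qquad
\widehat{h} \;=\; \operatorname*{arg\,min}_{h \in \mathcal{H}_n}
\Big\{ \widehat{R}_n(h) + \Psi_n(h) \Big\},
\]

where $\widehat{f}_{h,\eta}$ denotes the auxiliary estimator built from the kernel $K_h * K_\eta$, $\widehat{f}_\eta$ the estimator built from $K_\eta$, and $\Psi_n$ a majorant of the stochastic error. The selected estimator is $\widehat{f}_{\widehat{h}}$.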

