Abstract

The properties of adaptive non-parametric kernel estimators of the multivariate probability density f(x) (and its derivatives) of identically distributed random vectors ξ_n, n ≥ 1, at a given point are studied. It is assumed that the vectors form a martingale-difference process (ξ_n)_{n≥1} and that the function to be estimated belongs to a class of densities slightly narrower than the class of densities whose highest derivatives of order ν satisfy a smoothness condition with modulus Δ(t), where Δ(t), t ≥ 0, is some positive, bounded-from-above unknown function that is monotonically increasing for sufficiently small t.

An asymptotic mean-square criterion is proposed. The optimality, in the asymptotically minimax sense, of adaptive estimators of density derivatives is proved for a class of Bartlett kernel estimators with a random, data-driven bandwidth.

It is well known that optimizing the asymptotic value of the mean squared error of Bartlett kernel density estimators leads to an optimal bandwidth that depends on unknown functions, so these estimators are not straightforward to apply in practice. The paper proposes an adaptive approach to this problem, based on the idea of replacing the unknown functions in the optimal bandwidth with a sequence of estimators converging to the unknown values of these functions. It is shown that the constructed adaptive kernel estimators retain all the asymptotic properties of the sharp-optimal non-adaptive Bartlett estimators.

An example of an adaptive estimator that is optimal in the sense of the introduced criterion is considered. This estimator has a simple structure and can easily be used in real statistical problems. The proposed estimators possess the properties of uniform asymptotic normality and almost sure convergence.
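The plug-in idea described above — replacing the unknown quantities in the asymptotically optimal bandwidth with estimates computed from the data — can be illustrated by the following minimal univariate sketch. It assumes i.i.d. observations, a Gaussian kernel, and a normal-reference pilot for the unknown curvature term in the optimal bandwidth; these are assumptions made purely for illustration and do not reproduce the Bartlett-kernel, multivariate, martingale-difference construction analysed in the paper.

```python
import numpy as np

def gaussian_kernel(u):
    # Standard normal kernel (illustrative choice; the paper works with
    # Bartlett-type kernels, which are not reproduced here).
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def plug_in_bandwidth(x):
    # Normal-reference plug-in: the unknown roughness of f'' appearing in
    # the asymptotically optimal (AMISE) bandwidth is replaced by its value
    # under a fitted normal density, giving h = (4/3)^(1/5) * sigma * n^(-1/5).
    n = x.size
    iqr = np.percentile(x, 75) - np.percentile(x, 25)
    sigma_hat = min(np.std(x, ddof=1), iqr / 1.349)  # robust scale estimate
    return (4.0 / 3.0) ** 0.2 * sigma_hat * n ** (-0.2)

def kde_at_point(x, t, h=None):
    # Kernel density estimate of f(t) with a data-driven bandwidth.
    if h is None:
        h = plug_in_bandwidth(x)
    return gaussian_kernel((t - x) / h).mean() / h

rng = np.random.default_rng(0)
sample = rng.normal(size=1000)
print(kde_at_point(sample, 0.0))  # estimate of f(0); true value is about 0.3989
```

The sketch follows the same pattern as the adaptive construction in the abstract: estimate the unknown ingredients of the optimal bandwidth from the sample, plug them in, and evaluate the kernel estimator at the point of interest.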
