Abstract

Goldenshluger and Lepski (Probab Theory Relat Fields 159:479–543, 2014) studied adaptive minimax estimation of densities that need not be compactly supported on \({\mathbb {R}}^{d}\), under \(L^{p}\) risk over Nikol'skii classes, based on a data-driven selection of an estimator from a fixed family of kernel estimators. This paper obtains the same convergence rates with a data-driven wavelet estimator over Besov spaces; wavelet estimators provide more local information and admit fast algorithms. Moreover, we derive better convergence rates under an independence hypothesis, which effectively mitigates the curse of dimensionality.
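As a rough illustration of the building block behind such estimators (not the paper's data-driven procedure, which selects the resolution level from the data and works with smooth, possibly multivariate wavelets), the sketch below assumes a Haar father wavelet on \([0,1]\) and a fixed resolution level \(j\), and computes the empirical scaling coefficients \(\hat\alpha_{j,k}=\frac{1}{n}\sum_i \varphi_{j,k}(X_i)\) together with the resulting linear wavelet density estimate in Python/NumPy.

```python
import numpy as np

def haar_density_estimate(sample, level, grid):
    """Linear Haar wavelet density estimate on [0, 1] at a fixed resolution level.

    sample : 1-D array of observations assumed to lie in [0, 1]
    level  : resolution level j (the Haar scaling functions have support width 2**-j)
    grid   : points at which to evaluate the estimate

    Illustration only: the Haar basis and the fixed level j are assumptions,
    not the adaptive estimator studied in the paper.
    """
    n = len(sample)
    scale = 2 ** level
    # Empirical scaling coefficients alpha_{j,k} = (1/n) * sum_i phi_{j,k}(X_i),
    # where phi_{j,k}(x) = 2^{j/2} * 1{x in [k 2^-j, (k+1) 2^-j)}.
    bins = np.clip(np.floor(scale * np.asarray(sample)).astype(int), 0, scale - 1)
    alpha = np.bincount(bins, minlength=scale) * np.sqrt(scale) / n
    # Reconstruct f_hat(x) = sum_k alpha_{j,k} * phi_{j,k}(x).
    k = np.clip(np.floor(scale * np.asarray(grid)).astype(int), 0, scale - 1)
    return alpha[k] * np.sqrt(scale)

# Usage sketch: estimate the density of a Beta(2, 5) sample at level j = 4.
rng = np.random.default_rng(0)
x = rng.beta(2.0, 5.0, size=2000)
grid = np.linspace(0.0, 1.0, 200)
f_hat = haar_density_estimate(x, level=4, grid=grid)
```

For the Haar basis this linear estimate coincides with a histogram of bin width \(2^{-j}\); the data-driven methods discussed above choose \(j\) (or thresholds on detail coefficients) from the sample rather than fixing it in advance.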
