Abstract

We shall present here a general study of minimum contrast estimators in a nonparametric setting (although our results are also valid in the classical parametric case) for independent observations. These estimators include many of the most popular estimators in various situations, such as maximum likelihood estimators, least squares and other estimators of the regression function, and estimators for mixture models or deconvolution. The main theorem relates the rate of convergence of these estimators to the entropy structure of the parameter space. Optimal rates depending on entropy conditions are already known, at least for some of the models involved, and they agree with what we obtain for minimum contrast estimators as long as the entropy counts are not too large. But, under some circumstances ("large" entropies or changes in the entropy structure due to local perturbations), the resulting rates are only suboptimal. Counterexamples are constructed which show that the phenomenon is real for nonparametric maximum likelihood or regression. This proves that, under purely metric assumptions, our theorem is optimal and that minimum contrast estimators can indeed be suboptimal.
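To indicate what an entropy-to-rate relation of this kind typically looks like, the display below gives a rough sketch; the notation (parameter set S with metric d, entropy function H, estimator \hat s_n of s, rate \varepsilon_n) is chosen here only for illustration and is not taken from the paper, whose precise assumptions and entropy notion may differ.

% Illustrative sketch (not the paper's theorem): a minimum contrast
% estimator \hat s_n over S typically converges at the rate \varepsilon_n
% balancing sample size against the entropy H(\varepsilon) of S at scale \varepsilon.
\[
  \text{if } \varepsilon_n \text{ solves } \;
  n\,\varepsilon_n^2 \asymp H(\varepsilon_n),
  \qquad\text{then}\qquad
  d(\hat s_n, s) = O_P(\varepsilon_n).
\]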
