Abstract

Here we explore general asymptotic properties of Predictive Recursion (PR) for nonparametric estimation of mixing distributions. We prove that, when the mixture model is mis-specified, the estimated mixture converges almost surely in total variation to the mixture that minimizes the Kullback-Leibler divergence, and we obtain a bound on the (Hellinger contrast) rate of convergence. Simulations suggest that this rate is nearly sharp in a minimax sense. Moreover, when the model is identifiable, almost sure weak convergence of the mixing distribution estimate follows. PR assumes that the support of the mixing distribution is known. To remove this requirement, we propose a generalization that incorporates a sequence of supports, increasing with the sample size, and combines the efficiency of PR with the flexibility of mixture sieves. Under mild conditions, we obtain a bound on the rate of convergence of these new estimates.

AMS 2000 subject classifications: Primary 62G20; secondary 62G05, 62G07, 62G35.
Keywords and phrases: Almost supermartingale, density estimation, empirical Bayes, Kullback-Leibler projection, mixture models.
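For orientation, the following is a minimal sketch of the PR update on a fixed, discretized support: each observation reweights the current mixing-density estimate by the kernel likelihood and blends it with the previous estimate using a decaying weight. The Gaussian kernel, the grid, the weight exponent, and all function names below are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of the predictive recursion (PR) update on a fixed grid.
    # The N(u, sigma^2) kernel, the grid, and the weights w_i = (i+1)**(-gamma)
    # are illustrative assumptions, not choices from the paper.
    import numpy as np

    def pr_estimate(x, grid, sigma=1.0, gamma=0.67):
        """Recursively update a mixing-density estimate f on `grid`
        from observations `x`, assuming a N(u, sigma^2) kernel."""
        f = np.full(len(grid), 1.0 / len(grid))            # uniform initial guess
        for i, xi in enumerate(x, start=1):
            w = (i + 1.0) ** (-gamma)                      # decaying weight sequence
            k = np.exp(-0.5 * ((xi - grid) / sigma) ** 2)  # kernel k(x_i | u), up to a constant
            m = np.sum(k * f)                              # mixture density at x_i (same constant)
            f = (1.0 - w) * f + w * k * f / m              # PR update; f remains a probability vector
        return f

    # Example: data from a two-point mixing distribution at -2 and 2.
    rng = np.random.default_rng(0)
    u_true = rng.choice([-2.0, 2.0], size=500)             # latent mixing draws
    x = u_true + rng.normal(size=500)                      # observed mixture data
    grid = np.linspace(-5.0, 5.0, 101)
    f_hat = pr_estimate(x, grid)
    print(grid[f_hat > 0.5 * f_hat.max()])                 # mass should cluster near -2 and 2

Because the normalizing constant of the kernel cancels in k * f / m, and the update is a convex combination of probability vectors, f stays normalized on the grid at every step; this mirrors the fixed-support assumption that the paper's proposed sieve-style generalization is designed to relax.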
