Abstract

Predictive recursion (PR) is a fast stochastic algorithm for nonparametric estimation of mixing distributions in mixture models. It is known that the PR estimates of both the mixing and mixture densities are consistent under fairly mild conditions, but currently very little is known about the rate of convergence. Here I first investigate asymptotic convergence properties of the PR estimate under model misspecification in the special case of finite mixtures with known support. Tools from stochastic approximation theory are used to prove that the PR estimates converge, to the best Kullback–Leibler approximation, at a nearly root-n rate. When the support is unknown, PR can be used to construct an objective function which, when optimized, yields an estimate of the support. I apply the known-support results to derive a rate of convergence for this modified PR estimate in the unknown-support case, which compares favorably to known optimal rates.
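
To make the recursion concrete, here is a minimal sketch of the PR update in the finite-mixture, known-support setting considered here: the mixing distribution is a probability vector over the known support points, and each observation reweights it through the kernel. The weight sequence w_i = (i + 1)^(-gamma) and the function names are illustrative choices, not taken from the paper.

```python
import numpy as np

def predictive_recursion(x, support, kernel, f0=None, gamma=0.67):
    """One pass of PR over the data for a finite mixture with known support.

    x       : 1-d array of observations, processed in the given order
    support : 1-d array of the m known support points u_1, ..., u_m
    kernel  : kernel(xi, support) -> array of kernel densities p(xi | u_j)
    f0      : initial mixing weights (defaults to uniform on the support)
    gamma   : exponent of the weight sequence w_i = (i + 1)^(-gamma)
    Returns the estimated vector of mixing weights.
    """
    m = len(support)
    f = np.full(m, 1.0 / m) if f0 is None else np.asarray(f0, dtype=float)
    for i, xi in enumerate(x, start=1):
        w = (i + 1.0) ** (-gamma)            # decreasing weight sequence
        k = kernel(xi, support)              # p(x_i | u_j) at each support point
        mix = np.dot(k, f)                   # current mixture density at x_i
        f = (1.0 - w) * f + w * k * f / mix  # PR update; f stays a probability vector
    return f

# Example: two-component normal location mixture with known support {0, 3}
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 700), rng.normal(3.0, 1.0, 300)])
rng.shuffle(x)
norm_kernel = lambda xi, u: np.exp(-0.5 * (xi - u) ** 2) / np.sqrt(2.0 * np.pi)
print(predictive_recursion(x, np.array([0.0, 3.0]), norm_kernel))
# estimated weights should be close to the true proportions (0.7, 0.3)
```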

Highlights

  • Nonparametric estimation of mixing distributions is an important and challenging problem in statistics. Recent progress along these lines has been made with the fast stochastic predictive recursion (PR) algorithm due to Newton et al. (1998) and Newton (2002)

  • We shall confine ourselves here to an analysis of PR when the possibly misspecified model assumes that the data-generating distribution is a finite mixture with known support

  • We prove that the PR estimate of the mixing distribution converges almost surely at a nearly parametric root-n rate, where the limit is characterized by the mixture model closest to the true data-generating distribution based on the Kullback–Leibler divergence

Summary

Introduction

Nonparametric estimation of mixing distributions is an important and challenging problem in statistics. We shall confine ourselves here to an analysis of PR when the possibly misspecified model assumes that the data-generating distribution is a finite mixture with known support. In this case, we prove that the PR estimate of the mixing distribution converges almost surely at a nearly parametric root-n rate, where the limit is characterized by the mixture model closest to the true data-generating distribution based on the Kullback–Leibler divergence. This result sheds light on how one should choose PR’s tuning parameter in practical applications. Two numerical examples are given to illustrate the method; for more examples and the full computational details, the reader is referred to Martin (2011).
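
The abstract also notes that, when the support is unknown, PR can be used to build an objective function whose optimizer estimates the support. The paper's exact construction is not reproduced in this summary; the sketch below assumes a PR-style objective given by the accumulated log predictive score of a candidate support, maximized by brute force over small subsets of a candidate grid. The names pr_log_objective and estimate_support, and the brute-force search itself, are illustrative assumptions.

```python
import itertools
import numpy as np

def pr_log_objective(x, support, kernel, gamma=0.67):
    """Accumulated log predictive score sum_i log m_{i-1}(x_i), where m_{i-1}
    is the mixture density implied by the PR mixing weights after i-1
    observations under the candidate support (illustrative objective)."""
    f = np.full(len(support), 1.0 / len(support))
    total = 0.0
    for i, xi in enumerate(x, start=1):
        k = kernel(xi, support)
        mix = np.dot(k, f)                   # predictive mixture density at x_i
        total += np.log(mix)
        w = (i + 1.0) ** (-gamma)
        f = (1.0 - w) * f + w * k * f / mix  # same PR update as in the sketch above
    return total

def estimate_support(x, candidates, size, kernel):
    """Brute-force search: return the size-`size` subset of the candidate grid
    that maximizes the PR-style objective as the support estimate."""
    best, best_val = None, -np.inf
    for subset in itertools.combinations(candidates, size):
        val = pr_log_objective(x, np.asarray(subset), kernel)
        if val > best_val:
            best, best_val = np.asarray(subset), val
    return best

# Example (reusing x and norm_kernel from the sketch above):
# print(estimate_support(x, np.arange(-2.0, 6.0, 0.5), 2, norm_kernel))
```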

Predictive recursion
Asymptotics for PR with known support
PR with unknown support
Large-sample theory
Examples
Appendix A: Convergence rates for stochastic approximation