Abstract

Let f(y∣θ), θ∈Ω, be a parametric family, η(θ) a given function, and G an unknown mixing distribution. The goal is to estimate EG(η(θ))≡ηG based on independent observations Y1,...,Yn, where Yi∼f(y∣θi) and the θi∼G are i.i.d. We explore Generalized Maximum Likelihood Estimators (GMLE) for this problem. Some basic properties and representations of these estimators are established. In particular, we offer a new perspective on the weak convergence result of [14], with implications for a corresponding setup in which θ1,...,θn are fixed parameters. We also relate the problem of estimating ηG to nonparametric empirical Bayes estimation under squared loss. Applications of the GMLE to sampling problems are presented. The performance of the GMLE is demonstrated both in simulations and on a real-data example.
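The setup above can be illustrated with a minimal numerical sketch. The paper itself does not specify f or η; the choices below (Gaussian likelihood f(y∣θ) = N(θ, 1), η(θ) = θ, a two-point true G, and a fixed-grid EM approximation to the GMLE of G) are illustrative assumptions, not the authors' method. The GMLE Ĝ maximizes the marginal likelihood ∏ᵢ ∫ f(Yᵢ∣θ) dG(θ), and ηG is then estimated by the plug-in ∫ η(θ) dĜ(θ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model (assumed, not from the paper):
# theta_i ~ G, a two-point mixture 0.3*delta_{-1} + 0.7*delta_{2},
# Y_i | theta_i ~ N(theta_i, 1), so eta_G = E_G(theta) = 1.1.
n = 2000
theta = rng.choice([-1.0, 2.0], size=n, p=[0.3, 0.7])
y = theta + rng.standard_normal(n)

# Approximate GMLE of G on a fixed grid of support points via EM.
grid = np.linspace(y.min(), y.max(), 100)            # candidate atoms of G-hat
L = np.exp(-0.5 * (y[:, None] - grid[None, :])**2)   # f(y_i | theta_j); constant dropped
w = np.full(grid.size, 1.0 / grid.size)              # initial mixing weights

for _ in range(500):
    post = L * w                                     # unnormalized posterior over grid
    post /= post.sum(axis=1, keepdims=True)
    w = post.mean(axis=0)                            # EM update of mixing weights

# Plug-in estimate of eta_G with eta(theta) = theta: integrate against G-hat.
eta_hat = (w * grid).sum()                           # true value here is 1.1
```

The grid-plus-EM step is a standard way to compute an approximate nonparametric MLE of a mixing distribution; any other convex-optimization solver for the mixture weights would serve equally well.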
