Abstract

This paper is concerned with prediction for Gamma models, and more specifically with the estimation of a predictive density for Y ~ Ga(α₂, β) under Kullback–Leibler loss, based on X ~ Ga(α₁, β). The main focus is on situations where there is a parametric constraint of the form β ∈ C = (a, b). We obtain representations for Bayes predictive densities and for the minimum risk equivariant predictive density in the unconstrained problem. It is shown that the generalized Bayes estimator against the truncation of the non-informative prior onto C dominates the minimum risk equivariant predictive density and is minimax whenever a = 0 or b = ∞. Analytical comparisons of plug-in predictive densities Ga(α₂, β̂), which include the predictive MLE density, are obtained, with the results applying as well to point estimation under the dual entropy loss β/β̂ − log(β/β̂) − 1. Numerical evaluations confirm that such predictive densities are much less efficient than some Bayesian alternatives in exploiting the parametric restriction. Finally, it is shown that variance-expansion improvements of the form Ga(α₂/k, kβ̂) on plug-in predictive densities can always be found for a subset of values k > 1 and any non-degenerate β̂.
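The variance-expansion phenomenon mentioned at the end of the abstract can be illustrated numerically. The sketch below (not taken from the paper's own numerics; the shape values α₁ = 3, α₂ = 2, the true scale β = 1, and the choice β̂ = X/α₁, the MLE of the scale, are illustrative assumptions) estimates by Monte Carlo the Kullback–Leibler risk of the density Ga(α₂/k, kβ̂), which keeps the predictive mean at α₂β̂ while inflating the variance by the factor k; k = 1 recovers the plug-in density Ga(α₂, β̂).

```python
# Illustrative sketch: Monte Carlo comparison of the KL risk of the plug-in
# predictive density Ga(alpha2, beta_hat) (k = 1) with a variance-expanded
# modification Ga(alpha2/k, k*beta_hat), k > 1. Parameter values are
# illustrative assumptions, not the paper's settings.
import math
import random

def digamma(a, h=1e-5):
    """Crude digamma via a central difference of lgamma (adequate here)."""
    return (math.lgamma(a + h) - math.lgamma(a - h)) / (2 * h)

def kl_gamma(a1, s1, a2, s2):
    """Closed-form KL( Ga(a1, scale s1) || Ga(a2, scale s2) )."""
    return ((a1 - a2) * digamma(a1)
            + math.lgamma(a2) - math.lgamma(a1)
            + a2 * math.log(s2 / s1)
            + a1 * (s1 - s2) / s2)

def kl_risk(k, alpha1=3.0, alpha2=2.0, beta=1.0, reps=200_000, seed=1):
    """Average KL loss of Ga(alpha2/k, k*beta_hat), with beta_hat = X/alpha1."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        x = rng.gammavariate(alpha1, beta)  # X ~ Ga(alpha1, scale beta)
        beta_hat = x / alpha1               # MLE of the scale beta
        total += kl_gamma(alpha2, beta, alpha2 / k, k * beta_hat)
    return total / reps

# A modest expansion k > 1 lowers the risk relative to the plug-in (k = 1).
print(kl_risk(1.0), kl_risk(1.3))
```

Using a common seed makes the comparison paired, so the risk reduction at k = 1.3 is visible without a very large number of replications.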
