Abstract

We consider Pitman closeness domination in predictive density estimation problems when the underlying loss metric is $$\alpha$$-divergence, $$\{D(\alpha)\}$$, a class of losses introduced by Csiszar (Stud Sci Math Hung 2:299–318, 1967). The underlying distributions considered are normal location-scale models, including the distribution of the observables, the distribution of the variable whose density is to be predicted, and the estimated predictive density, which is taken to be of the plug-in type. The scales may be known or unknown. Chang and Strawderman (J Multivar Anal 128:1–9, 2014) derived a general expression for the $$\alpha$$-divergence loss in this setup and showed that it is a concave monotone function of quadratic loss, as well as a function of the variances (predicand and plug-in). We demonstrate $$\{D(\alpha)\}$$ Pitman closeness domination of certain plug-in predictive densities over others, simultaneously for the entire class of metrics, whenever modified Pitman closeness domination holds in the related problem of estimating the mean. We also establish $$\{D(\alpha)\}$$ Pitman closeness results for certain generalized Bayesian (best invariant) predictive density estimators. The examples of $$\{D(\alpha)\}$$ Pitman closeness domination presented concern estimating the predictive density of the variable with the larger mean. We also consider the case of two ordered normal means with a known covariance matrix.
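
For orientation, a standard form of the $$\alpha$$-divergence between a predictive density estimate $$\hat{p}$$ and the true density $$p$$, as commonly used in this literature, is sketched below; the normalization shown is an assumption, and the paper's own convention may differ.

$$D_{\alpha}(\hat{p},p)=\int f_{\alpha}\!\left(\frac{\hat{p}(y)}{p(y)}\right)p(y)\,dy,\qquad f_{\alpha}(z)=\begin{cases}\dfrac{4}{1-\alpha^{2}}\left(1-z^{(1+\alpha)/2}\right),&|\alpha|<1,\\[1ex] z\log z,&\alpha=1,\\[1ex] -\log z,&\alpha=-1.\end{cases}$$

Under this convention, $$\alpha=-1$$ recovers Kullback–Leibler loss and $$\alpha=0$$ corresponds to Hellinger-type loss, so the class $$\{D(\alpha)\}$$ covers the familiar divergences for which the domination results are claimed to hold simultaneously.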
