Abstract

Bayes estimation of the mean of a variance mixture of multivariate normal distributions is considered under sum of squared errors loss. We find a broad class of priors (also in the variance mixture of normal class) which result in proper and generalized Bayes minimax estimators. This paper extends the results of Strawderman [Minimax estimation of location parameters for certain spherically symmetric distributions, J. Multivariate Anal. 4 (1974) 255–264] in a manner similar to that of Maruyama [Admissible minimax estimators of a mean vector of scale mixtures of multivariate normal distributions, J. Multivariate Anal. 21 (2003) 69–78], but somewhat more in the spirit of Fourdrinier et al. [On the construction of Bayes minimax estimators, Ann. Statist. 26 (1998) 660–671] for the normal case, in the sense that we construct classes of priors giving rise to minimaxity. A feature of this paper is that in certain cases we are able to construct proper Bayes minimax estimators satisfying the properties and bounds in Strawderman [Minimax estimation of location parameters for certain spherically symmetric distributions, J. Multivariate Anal. 4 (1974) 255–264]. We also give some insight into why Strawderman's results do or do not seem to apply in certain cases. In cases where they do not apply, we give minimax estimators based on Berger's [Minimax estimation of location vectors for a wide class of densities, Ann. Statist. 3 (1975) 1318–1328] results. A main condition for minimaxity is that the mixing distributions of the sampling distribution and the prior distribution satisfy a monotone likelihood ratio property with respect to a scale parameter.
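As a minimal sketch of the setup summarized above (our own notation; the mixing distributions G and H and the dimension p are introduced here only for illustration and may differ from the paper's), both the sampling density and the prior are variance mixtures of normals and the loss is the sum of squared errors:

\[
f(x \mid \theta) \;=\; \int_0^\infty (2\pi v)^{-p/2}\exp\!\Big(-\tfrac{\|x-\theta\|^2}{2v}\Big)\,dG(v),
\qquad
\pi(\theta) \;=\; \int_0^\infty (2\pi w)^{-p/2}\exp\!\Big(-\tfrac{\|\theta\|^2}{2w}\Big)\,dH(w),
\]
\[
L(\theta,\delta) \;=\; \|\delta-\theta\|^2 \;=\; \sum_{i=1}^{p}(\delta_i-\theta_i)^2 .
\]

Under a model of this kind the (generalized) Bayes estimator is the posterior mean of \(\theta\), and minimaxity is typically established by showing that its risk does not exceed the constant risk of the usual estimator \(X\); the monotone likelihood ratio condition referred to above compares the mixing distributions of the sampling density and the prior as a scale parameter varies.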
