Abstract

Assume that $X_1, \cdots, X_p$ are independent random observations having discrete exponential densities $\rho_i(\theta_i)t_i(x_i)\theta_i^{x_i}, i = 1, \cdots, p$, respectively. A general technique for improving upon the uniform minimum variance unbiased estimator (UMVUE) of $(\theta_1, \cdots, \theta_p)$ is developed under possibly weighted squared error loss functions. It is shown that improved estimators can be constructed by solving a difference inequality. Typical difference inequalities of a fairly general type are presented and solved. When specialized to the Poisson and negative binomial cases, broad classes of estimators are given that dominate the UMVUE. These results unify many known results in this rapidly developing field, and some of them are new (especially those related to negative binomial distributions). Improved estimators are also obtained for problems in which some of the observations come from Poisson families and some from negative binomial families. For the sum-of-squared-errors loss, estimators that dominate the UMVUE in discrete exponential families are also given explicitly.
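As a concrete illustration of the phenomenon the abstract describes (not the paper's own construction), the classical Clevenson–Zidek shrinkage estimator for $p$ independent Poisson means, $\delta_i(X) = \bigl(1 - \tfrac{p-1}{\sum_j X_j + p - 1}\bigr) X_i$, is known to dominate the UMVUE $\delta_i(X) = X_i$ under the normalized squared error loss $\sum_i (\delta_i - \theta_i)^2/\theta_i$ when $p \ge 2$. The following Monte Carlo sketch compares the two risks at an arbitrarily chosen parameter point; the parameter values and simulation size are assumptions for illustration only.

```python
import numpy as np

def umvue(x):
    # The UMVUE of a Poisson mean vector is the observation itself.
    return x.astype(float)

def clevenson_zidek(x):
    # Shrink the observations toward zero by a common data-dependent factor:
    # delta_i = (1 - (p - 1) / (sum_j x_j + p - 1)) * x_i
    p = x.size
    return (1.0 - (p - 1) / (x.sum() + p - 1)) * x

def risk(estimator, theta, n_sim=20000, seed=0):
    # Monte Carlo estimate of the risk under normalized squared error loss
    # L(delta, theta) = sum_i (delta_i - theta_i)^2 / theta_i.
    rng = np.random.default_rng(seed)
    x = rng.poisson(theta, size=(n_sim, theta.size))
    losses = [np.sum((estimator(xi) - theta) ** 2 / theta) for xi in x]
    return float(np.mean(losses))

# Hypothetical parameter point chosen for illustration.
theta = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
r_umvue = risk(umvue, theta)            # exact risk is p = 5 under this loss
r_cz = risk(clevenson_zidek, theta)     # strictly smaller for p >= 2
```

Because both risks are computed from the same simulated draws (same seed), the comparison is paired, and `r_cz` comes out below `r_umvue`, consistent with the domination result.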
