Abstract

Consider a multiple linear regression in which $Y_i$, $i = 1, \cdots, n$, are independent normal variables with variance $\sigma^2$ and $E(Y_i) = \alpha + V_i'\beta$, where $V_i \in \mathbb{R}^r$ and $\beta \in \mathbb{R}^r$. Let $\hat{\alpha}$ denote the usual least squares estimator of $\alpha$. Suppose that the $V_i$ are themselves observations of independent multivariate normal random variables with mean 0 and known, nonsingular covariance matrix $\theta$. Then $\hat{\alpha}$ is inadmissible under squared error loss if $r \geq 2$. Several estimators dominating $\hat{\alpha}$ when $r \geq 3$ are presented. Analogous results are presented for the case where $\sigma^2$ or $\theta$ is unknown, and some other generalizations are also considered. It is noted that some of these results for $r \geq 3$ appear in earlier papers of Baranchik and of Takada. In the above setting the $\{V_i\}$ are ancillary statistics. Hence the admissibility of $\hat{\alpha}$ depends on the distribution of the ancillary statistics: if the $\{V_i\}$ are held fixed instead of random, then $\hat{\alpha}$ is admissible. This fact contradicts a widely held notion about ancillary statistics; some interpretations and consequences of this paradox are briefly discussed.
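A minimal simulation sketch of this setup, assuming NumPy and purely illustrative parameter values (the choices of $n$, $r$, $\alpha$, $\beta$, $\sigma^2$, and $\theta$ below are not from the paper): it draws the $V_i$ at random as described above and computes the usual least squares intercept estimate $\hat{\alpha}$. It does not implement the dominating estimators referred to in the abstract.

```python
# Sketch of the model: Y_i = alpha + V_i' beta + error, with V_i ~ N(0, theta).
# All numerical values are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

n, r = 50, 3                      # sample size and number of covariates
alpha = 1.0                       # intercept to be estimated
beta = rng.normal(size=r)         # regression coefficients
sigma2 = 1.0                      # error variance sigma^2
theta = np.eye(r)                 # known, nonsingular covariance of the V_i

# The V_i are themselves random: i.i.d. multivariate normal with mean 0.
V = rng.multivariate_normal(np.zeros(r), theta, size=n)
Y = alpha + V @ beta + rng.normal(scale=np.sqrt(sigma2), size=n)

# Usual least squares fit of E(Y_i) = alpha + V_i' beta;
# the first fitted coefficient is the intercept estimate alpha-hat.
X = np.column_stack([np.ones(n), V])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
alpha_hat = coef[0]
print("alpha-hat (least squares):", alpha_hat)
```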
