Abstract

In this paper we consider estimation of the location parameter $\theta \in \mathbb{R}^d$ based on a random sample from $(\theta + X, Y)$, where $X$ is a $d$-dimensional random vector, $Y$ is a random element of some measure space $\mathscr{Y}$, and $(X, Y)$ has a known distribution. We first define the Fisher information $\mathscr{J}(\theta + X, Y)$ and the inverse information $\mathscr{J}^-(\theta + X, Y)$ under no regularity conditions, and investigate the properties of these quantities. Supposing that $E|X|^\delta < \infty$ for some $\delta > 0$, we show that for $n$ sufficiently large the Pitman estimator $\hat{\theta}_n$ of $\theta$ based on a random sample of size $n$ is well defined and unbiased, and that its covariance, which does not depend on $\theta$, satisfies the inequality $n \operatorname{Cov} \hat{\theta}_n \geq \mathscr{J}^-(\theta + X, Y)$. Moreover, $\lim_{n\rightarrow \infty} n \operatorname{Cov} \hat{\theta}_n = \mathscr{J}^-(\theta + X, Y)$, and $n^{1/2}(\hat{\theta}_n - \theta)$ is asymptotically normal with mean zero and covariance $\mathscr{J}^-(\theta + X, Y)$.
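
For orientation (this display is not part of the original abstract), recall the standard form of the Pitman estimator in the simpler setting where the auxiliary component $Y$ is absent and $X$ has a density $f$ with respect to Lebesgue measure on $\mathbb{R}^d$: it is the generalized Bayes (posterior-mean) estimator under the flat prior on $\theta$. A minimal sketch, writing $Z_i = \theta + X_i$ for the observations:

$$\hat{\theta}_n \;=\; \frac{\int_{\mathbb{R}^d} t \, \prod_{i=1}^{n} f(Z_i - t)\, dt}{\int_{\mathbb{R}^d} \prod_{i=1}^{n} f(Z_i - t)\, dt}.$$

In the more general setting of the paper, one would expect the corresponding expression to condition on the observed $Y_i$ as well; the precise construction used by the authors is given in the body of the paper, not here.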
