Abstract

Let $\hat{T}_n$ be an estimate of the form $\hat{T}_n = T(\hat{F}_n)$, where $\hat{F}_n$ is the sample cdf of $n$ iid observations and $T$ is a locally quadratic functional defined on cdf's. Then the normalized jackknife estimates of bias, skewness, and variance of $\hat{T}_n$ closely approximate their bootstrap counterparts. Each of these estimates is consistent. Moreover, the jackknife and bootstrap estimates of variance are asymptotically normal and asymptotically minimax. The main results are as follows: the first-order Edgeworth expansion estimate for the distribution of $n^{1/2}(\hat{T}_n - T(F))$, with $F$ the actual cdf of each observation and the expansion coefficients estimated by jackknifing, is asymptotically equivalent to the corresponding bootstrap distribution estimate, up to and including terms of order $n^{-1/2}$. Both distribution estimates are asymptotically minimax. The jackknife Edgeworth expansion estimate suggests useful corrections for skewness and bias to upper and lower confidence bounds for $T(F)$.
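
As background for these comparisons (standard textbook forms, not taken from the paper itself), the leave-one-out jackknife variance estimate and a generic one-term Edgeworth expansion can be sketched as follows; the normalization and the exact form of the polynomial $q$ are illustrative assumptions rather than the paper's own notation:
\[
\widehat{\operatorname{Var}}_{\mathrm{JACK}} = \frac{n-1}{n} \sum_{i=1}^{n} \bigl(\hat{T}_{n,-i} - \bar{T}_{n}\bigr)^{2}, \qquad \bar{T}_{n} = \frac{1}{n} \sum_{i=1}^{n} \hat{T}_{n,-i},
\]
where $\hat{T}_{n,-i} = T(\hat{F}_{n,-i})$ is recomputed from the sample cdf with the $i$th observation deleted, and
\[
P\bigl\{ n^{1/2}\bigl(\hat{T}_n - T(F)\bigr) \le \sigma x \bigr\} \approx \Phi(x) + n^{-1/2}\, q(x)\, \phi(x),
\]
with $\Phi$ and $\phi$ the standard normal cdf and density and $q$ typically a quadratic polynomial whose coefficients involve the bias and skewness of $\hat{T}_n$. In the paper, those coefficients are estimated by jackknifing, and the resulting expansion estimate agrees with the bootstrap distribution estimate through terms of order $n^{-1/2}$.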
