Abstract

Let X be an observable random variable with unknown distribution function F(x) = ℙ(X ≤ x), -∞ < x < ∞, and let

$$\theta = \sup\bigl\{ r \geq 0 : \mathbb{E} \vert X \vert^{r} < \infty \bigr\}.$$

We call θ the power of moments of the random variable X. Let X_1, X_2, …, X_n be a random sample of size n drawn from F(·). In this paper we propose the following simple point estimator of θ and investigate its asymptotic properties:

$$\hat{\theta}_{n} = \frac{\log n}{\log \max_{1 \leq k \leq n} \vert X_{k} \vert},$$

where log x = ln(e ∨ x), -∞ < x < ∞. In particular, we show that

$$\hat{\theta}_{n} \rightarrow_{\mathbb{P}} \theta \quad \text{if and only if} \quad \lim_{x \rightarrow \infty} x^{r}\, \mathbb{P}\bigl( \vert X \vert > x \bigr) = \infty \quad \forall r > \theta.$$

This means that, under very reasonable conditions on F(·), θ̂_n is actually a consistent estimator of θ.
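
As a concrete illustration of how θ̂_n can be computed from a sample, here is a minimal Python sketch (my own illustration, not code from the paper; the function name power_of_moments_estimator and the NumPy dependency are assumptions):

    import numpy as np

    def power_of_moments_estimator(x):
        """Estimate theta = sup{r >= 0 : E|X|^r < infinity} from a sample x.

        Computes theta_hat_n = log(n) / log(max_{1<=k<=n} |X_k|), where
        log(y) is read as ln(e v y), so numerator and denominator are both >= 1.
        """
        x = np.asarray(x, dtype=float)
        n = x.size
        numerator = np.log(max(np.e, n))                  # log n = ln(e v n)
        denominator = np.log(max(np.e, np.abs(x).max()))  # log max|X_k| = ln(e v max|X_k|)
        return numerator / denominator

For instance, applied to a large standard Cauchy sample (for which θ = 1, since E|X|^r < ∞ exactly when r < 1), the returned estimate should be close to 1.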

Highlights

  • We call θ the power of moments of the random variable X

  • Let X_1, X_2, …, X_n be a random sample of size n drawn from the random variable X; i.e., X_1, X_2, …, X_n are independent and identically distributed (i.i.d.) random variables whose common distribution function is F(·)

  • In this paper we propose the following simple point estimator of θ, the power of moments of the random variable X, and investigate its asymptotic properties: θ̂_n = log n / log max_{1≤k≤n} |X_k|

Summary

Introduction

There exists an increasing positive integer sequence {l_n; n ≥ 1} (which depends on the probability distribution of X when ρ_1 < ∞) such that

$$\lim_{n \rightarrow \infty} \frac{\log \max_{1 \leq k \leq l_{n}} X_{k}}{\log l_{n}} = \frac{1}{\rho_{1}} \quad \text{a.s.}$$ (2.2)

There exists an increasing positive integer sequence {m_n; n ≥ 1} (which depends on the probability distribution of X when ρ_2 > 0) such that

$$\lim_{n \rightarrow \infty} \frac{\log \max_{1 \leq k \leq m_{n}} X_{k}}{\log m_{n}} = \frac{1}{\rho_{2}} \quad \text{a.s.}$$ (2.4)

Since ∑_{n=1}^{∞} 1/(2n) = ∞, it follows from the Borel–Cantelli lemma that lim sup_{n→∞} U_n = 3.
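
To make these limit statements concrete, here is a small Monte Carlo sketch (again my own illustration, not from the paper) that mirrors the estimator defined in the abstract. It uses a standard Pareto(α) sample with ℙ(X > x) = x^(-α) for x ≥ 1, so that E|X|^r < ∞ exactly when r < α (hence θ = α) and x^r ℙ(|X| > x) → ∞ for every r > α; the value α = 2.5 and the NumPy sampling routine are choices made purely for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    alpha = 2.5  # tail index; theta = alpha for this distribution

    for n in (10**3, 10**4, 10**5, 10**6):
        # Generator.pareto draws Lomax (Pareto II) variates on [0, inf);
        # adding 1 gives the classical Pareto law with P(X > x) = x**(-alpha), x >= 1
        x = rng.pareto(alpha, size=n) + 1.0
        theta_hat = np.log(max(np.e, n)) / np.log(max(np.e, np.abs(x).max()))
        print(n, round(theta_hat, 3))

Because θ̂_n depends only on the sample maximum, the printed values should hover around α = 2.5 with noticeable sampling fluctuation, tightening only slowly (the error is roughly of order 1/log n).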

