Abstract

It is well known that central order statistics exhibit central limit behavior and converge to a Gaussian distribution as the sample size grows. This paper strengthens this classical result by establishing an entropic version of the central limit theorem, which guarantees a stronger mode of convergence measured in relative entropy. This upgrade in convergence comes at the expense of additional regularity conditions, which can be considered mild. To prove the result, several ancillary results on order statistics are derived that may be of independent interest: for instance, a rather general bound on the moments of order statistics, and an upper bound on the mean squared error of estimating the $p \in (0,1)$-th quantile of an unknown cumulative distribution function. Finally, the necessity of the derived conditions for convergence, the rate of convergence, and the monotonicity of the relative entropy are discussed.
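For orientation, the classical distributional form of the result that the paper strengthens can be sketched as follows. This is the standard textbook statement under the usual smoothness assumptions, included here only as a reference point; the paper's precise regularity conditions may differ. For $X_1, \dots, X_n$ i.i.d. with CDF $F$ admitting a density $f$ that is positive and continuous at the quantile $F^{-1}(p)$, the $\lceil np \rceil$-th order statistic satisfies
\[
  \sqrt{n}\,\Bigl( X_{(\lceil np \rceil)} - F^{-1}(p) \Bigr)
  \;\xrightarrow{d}\;
  \mathcal{N}\!\left( 0,\; \frac{p(1-p)}{f\bigl(F^{-1}(p)\bigr)^{2}} \right),
  \qquad n \to \infty .
\]
The entropic version established in the paper replaces this convergence in distribution with convergence to zero of the relative entropy between the law of the normalized order statistic and its Gaussian limit, which is strictly stronger: by Pinsker's inequality it implies convergence in total variation, and hence convergence in distribution.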
