Abstract

This paper addresses the largest and the smallest observations at the times when a new record of either kind (upper or lower) occurs; these are called the current upper and lower records, respectively. We examine the entropy properties of these statistics, in particular the difference between the entropies of the upper and lower endpoints of the record coverage. Results are presented for some common parametric families of distributions. Several upper and lower bounds for the entropy of current records are obtained in terms of the entropy of the parent distribution. It is shown that the mutual information between the endpoints of the record coverage, as well as the Kullback–Leibler distances between those endpoints and between the data distribution and the current records, are all distribution-free.
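
For concreteness, the information measures referred to above are, in the usual notation (a sketch of the standard definitions; the paper's own notation may differ):

H(f) = -\int f(x) \log f(x) \, dx   (Shannon/differential entropy of a density f),

K(f, g) = \int f(x) \log \frac{f(x)}{g(x)} \, dx   (Kullback–Leibler distance between densities f and g),

I(X; Y) = K(f_{X,Y}, \, f_X f_Y)   (mutual information between X and Y).

Here "distribution-free" is used in its usual sense: the quantity does not depend on the (continuous) parent distribution of the data.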
