Abstract

Ranked set sampling and some of its variants have been applied successfully in different areas of application such as industrial statistics, economics, environmental and ecological studies, biostatistics, and statistical genetics. Ranked set sampling is a sampling method that is more efficient than simple random sampling. It is also well known that, in parametric inference, a ranked set sample (RSS) carries more Fisher information about the unknown parameter of the underlying distribution than a simple random sample (SRS) of the same size. In this paper, we consider the Farlie-Gumbel-Morgenstern (FGM) family and study information measures such as Shannon's entropy, Rényi entropy, mutual information, and Kullback-Leibler (KL) information of RSS data. We also investigate their properties and compare them with those of SRS data.
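For reference, the quantities named in the abstract take standard forms; the following summary (not quoted from the paper) writes the FGM family with dependence parameter \(\alpha\) and marginals \(F_X, F_Y\), together with the usual definitions of the information measures:

\[
F_{X,Y}(x,y) = F_X(x)\,F_Y(y)\bigl[1+\alpha\,(1-F_X(x))(1-F_Y(y))\bigr], \qquad -1\le\alpha\le 1,
\]
\[
f_{X,Y}(x,y) = f_X(x)\,f_Y(y)\bigl[1+\alpha\,(1-2F_X(x))(1-2F_Y(y))\bigr],
\]
\[
H(X) = -\int f_X(x)\log f_X(x)\,dx, \qquad
H_\beta(X) = \frac{1}{1-\beta}\log\int f_X^{\beta}(x)\,dx \quad (\beta>0,\ \beta\neq 1),
\]
\[
I(X;Y) = \iint f_{X,Y}(x,y)\log\frac{f_{X,Y}(x,y)}{f_X(x)\,f_Y(y)}\,dx\,dy, \qquad
K(f\,\Vert\,g) = \int f(x)\log\frac{f(x)}{g(x)}\,dx.
\]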

Highlights

  • McIntyre (1952) first proposed ranked set sampling for estimating mean pasture yields and indicated that it is a more efficient sampling method than simple random sampling for estimating the population mean

  • Ranking of the units is done by low-cost means such as previous experience, visual inspection, or a concomitant variable

  • Stokes (1977) applied ranked set sampling to a bivariate random variable (X, Y), where X is the variable of interest and Y is a concomitant variable that is not of direct interest but is relatively easy to measure


Summary

Introduction

McIntyre (1952) first proposed ranked set sampling for estimating mean pasture yields and indicated that it is a more efficient sampling method than simple random sampling for estimating the population mean. Stokes (1977) applied ranked set sampling to a bivariate random variable (X, Y), where X is the variable of interest and Y is a concomitant variable that is not of direct interest but is relatively easy to measure. The procedure of ranked set sampling described by Stokes (1977) for a bivariate random variable is as follows. Step 1: draw n independent bivariate samples, each of size n. Step 2: in the i-th sample, rank the units by their Y values and measure the X value attached to the i-th smallest Y. Step 3: the n measured X values constitute a ranked set sample of size n. For the FGM family, we study information measures such as Shannon's entropy, Rényi entropy, mutual information, and Kullback-Leibler (KL) information of RSS data. Abo-Eleneen (2001) and Abo-Eleneen and Nagaraja (2002a) studied Fisher information in pairs and collections of order statistics and their concomitants from bivariate samples. Tahmasebi and Behboodian (2012) obtained some results on information measures for concomitants of order statistics in the FGM family.
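As an illustration of the procedure above, here is a minimal simulation sketch (not from the paper). It assumes Uniform(0,1) marginals, an FGM copula with dependence parameter alpha, and set size n, and it ranks each set by the concomitant Y before measuring X:

import numpy as np

def fgm_sample(size, alpha, rng):
    # Draw (X, Y) pairs with Uniform(0,1) marginals from the FGM copula
    # C(u, v) = u*v*[1 + alpha*(1 - u)*(1 - v)],  -1 <= alpha <= 1.
    u = rng.uniform(size=size)
    w = rng.uniform(size=size)
    # Conditional CDF of Y given X = u is F(y|u) = (1 + a)*y - a*y**2 with
    # a = alpha*(1 - 2*u); invert it at w by solving the quadratic for y.
    a = alpha * (1.0 - 2.0 * u)
    b = 1.0 + a
    y = np.where(np.abs(a) < 1e-12, w, (b - np.sqrt(b**2 - 4.0 * a * w)) / (2.0 * a))
    return u, y

def ranked_set_sample(n, alpha, rng):
    # Stokes (1977): draw n independent sets of n pairs; in the i-th set rank the
    # units by the concomitant Y and measure the X attached to the i-th smallest Y.
    xs = []
    for i in range(n):
        x, y = fgm_sample(n, alpha, rng)
        xs.append(x[np.argsort(y)[i]])   # X value of the unit whose Y has rank i + 1
    return np.array(xs)

rng = np.random.default_rng(0)
rss = ranked_set_sample(n=5, alpha=0.8, rng=rng)   # ranked set sample of X, size 5
srs = fgm_sample(5, 0.8, rng)[0]                   # simple random sample of X, size 5
print("RSS:", rss)
print("SRS:", srs)

In this sketch the rank used for measurement in the i-th set is i, so the n measured X values are concomitants of order statistics of Y; this is the structure whose entropy, mutual information, and KL information are compared with those of SRS data below.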

Shannon entropy of RSS in FGM family
Rényi entropy of RSS in FGM family
Mutual information of RSS in FGM family
Kullback-Leibler information of RSS in FGM family