Abstract

This chapter discusses statistical models that state assumptions about the family of distributions to which the parent distribution belongs. A statistical model specifies the possible range of the parameters, called the parameter space, together with the corresponding family of distributions. Every sample contains a certain amount of information about the parent distribution, and the larger the number of observations in the sample, the more information it carries about the distribution under consideration. A natural first question is whether the sample data can be condensed, by first computing the values of certain statistics, without losing information; statistics with this property are called sufficient statistics. The term statistic denotes a function of the random variables that does not involve the unknown parameters or any other unknown characteristic of their distributions. The chapter then discusses two information functions used in statistical analysis: (1) the Fisher information function and (2) the Kullback–Leibler information function. The two are related but designed to fulfill different roles: the Fisher information function is applied in various estimation problems, while the Kullback–Leibler information function has direct applications in the theory of testing hypotheses.
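
The notion of sufficiency summarized above is usually made precise through the Neyman–Fisher factorization criterion. The abstract does not state it, so the following is a standard formulation rather than a quotation from the chapter:

```latex
% Neyman–Fisher factorization criterion (standard statement, not quoted from the chapter):
% a statistic T(X_1,\dots,X_n) is sufficient for the parameter \theta if and only if
% the joint density factors into a part that depends on the data only through T
% and a part that is free of \theta.
f(x_1,\dots,x_n;\theta)
  = g\bigl(T(x_1,\dots,x_n);\theta\bigr)\, h(x_1,\dots,x_n)
```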
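For reference, the two information functions named in the abstract have the following standard definitions, stated here for a density f(x; θ) under the usual regularity conditions; they are not reproduced from the chapter itself:

```latex
% Fisher information carried by a single observation about the parameter \theta:
I(\theta)
  = \mathbb{E}_{\theta}\!\left[
      \left( \frac{\partial}{\partial\theta} \log f(X;\theta) \right)^{\!2}
    \right]

% Kullback–Leibler information for discriminating the density at \theta_0
% from the density at \theta_1:
I(\theta_0;\theta_1)
  = \mathbb{E}_{\theta_0}\!\left[
      \log \frac{f(X;\theta_0)}{f(X;\theta_1)}
    \right]
```

These definitions make the division of roles plausible: the Fisher information appears in the Cramér–Rao lower bound on the variance of unbiased estimators, which ties it to estimation, while the Kullback–Leibler quantity is the expected log-likelihood ratio, which is why it arises naturally in the theory of testing hypotheses.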
