Abstract

It is well-known that some information measures, including Fisher information and entropy, can be represented in terms of the hazard function. In this paper, we provide representations of further information measures, including the quantal Fisher information and the quantal Kullback-Leibler (KL) information, in terms of the hazard function and the reverse hazard function. We also provide some estimators of the quantal KL information, which include the Anderson-Darling test statistic, and compare their performances.
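For reference, the two weighting functions named in the abstract can be written out. The reverse hazard definition below matches the one used in the Introduction further down this page, and the hazard definition is the standard complement; the displayed quantal KL form, however, is an assumed reading (the KL divergence between the Bernoulli laws with success probabilities F(t) and G(t)), not a display quoted from the paper.

```latex
% Hazard and reverse hazard functions of distributions with d.f.'s F, G
% and p.d.f.'s f, g:
\[ h_f(x) = \frac{f(x)}{1 - F(x)}, \qquad r_f(x) = \frac{f(x)}{F(x)}, \]
% and analogously h_g(x) = g(x)/(1 - G(x)), r_g(x) = g(x)/G(x).

% Assumed form of the quantal KL information between F and G at a threshold t:
\[ \mathrm{KL}_t(F : G) = F(t)\log\frac{F(t)}{G(t)}
   + \bigl(1 - F(t)\bigr)\log\frac{1 - F(t)}{1 - G(t)}. \]
```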

Highlights

  • Suppose that X is a random variable with a continuous probability density function (p.d.f.) f(x; θ), where θ is a real-valued scalar parameter. It is well-known that the Fisher information, defined as I(θ) = E[(∂ log f(X; θ)/∂θ)²], plays an important role in statistical estimation and inference.

  • We show that the quantal Fisher information can be expressed in terms of both the hazard function and the reversed hazard function (see the displays after this list).

  • It is well-known that both the Fisher information and the Kullback-Leibler information can be represented in terms of the hazard function or the reverse hazard function.
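The displays referred to in the first two highlights did not survive extraction. What can be restored with confidence is the standard definition of the Fisher information and its hazard-function representation (the Efron-Johnstone identity that the third highlight alludes to); the quantal Fisher information form shown last is an assumed reading, namely the Fisher information carried by the binary observation 1{X ≤ t}, not the paper's own display.

```latex
% Fisher information about the scalar parameter theta (standard definition):
\[ I(\theta) = E_\theta\!\left[\Bigl(\tfrac{\partial}{\partial\theta}
   \log f(X;\theta)\Bigr)^{2}\right]. \]

% Efron-Johnstone identity: the same quantity through the hazard function
% h_f(x; theta) = f(x; theta) / (1 - F(x; theta)):
\[ I(\theta) = E_\theta\!\left[\Bigl(\tfrac{\partial}{\partial\theta}
   \log h_f(X;\theta)\Bigr)^{2}\right]. \]

% Assumed quantal Fisher information at a threshold t, i.e. the Fisher
% information in the indicator 1{X <= t}:
\[ I_t(\theta) = \frac{\bigl(\partial F(t;\theta)/\partial\theta\bigr)^{2}}
   {F(t;\theta)\bigl(1 - F(t;\theta)\bigr)}. \]
```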


Summary

Introduction

Suppose that X is a random variable with a continuous probability density function (p.d.f.) f(x; θ), where θ is a real-valued scalar parameter. It is well-known that the Fisher information, defined as I(θ) = E[(∂ log f(X; θ)/∂θ)²], plays an important role in statistical estimation and inference. The quantal information measures studied in the paper can be represented in terms of the hazard functions h_f(x), h_g(x) and the reverse hazard functions r_f(x), r_g(x), where r_f(x) and r_g(x) are defined as f(x)/F(x) and g(x)/G(x), respectively. This representation enables us to estimate the quantal information by employing a nonparametric hazard function estimator.
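The representation itself is not recoverable from this page, but the estimation route it points to can be sketched. Below is a minimal Python sketch, assuming (i) the quantal KL information at t is the KL divergence between the Bernoulli laws with success probabilities F(t) and G(t), as displayed above, and (ii) the integrated measure is estimated by plugging the empirical distribution function into that integrand, rather than by the nonparametric hazard estimator the paper mentions; all function names are illustrative, not from the paper. A second-order expansion of the Bernoulli KL, KL(p : q) ≈ (p − q)²/(2q(1 − q)), suggests why the Anderson-Darling statistic appears among the estimators mentioned in the abstract.

```python
import numpy as np
from scipy.stats import norm


def bernoulli_kl(p, q, eps=1e-12):
    """KL divergence between Bernoulli(p) and Bernoulli(q), elementwise."""
    p = np.clip(p, eps, 1.0 - eps)
    q = np.clip(q, eps, 1.0 - eps)
    return p * np.log(p / q) + (1.0 - p) * np.log((1.0 - p) / (1.0 - q))


def quantal_kl_statistic(sample, cdf_g):
    """Plug-in estimate of n * integral of KL(F_n(t) : G(t)) dG(t), where F_n
    is the empirical d.f. of `sample` and G is a hypothesized continuous d.f.

    Averaging the Bernoulli KL over the order statistics approximates the
    dG-integral via the probability integral transform; the expansion
    KL(p : q) ~ (p - q)^2 / (2 q (1 - q)) ties this to the Anderson-Darling
    statistic, which uses the weight 1 / (G (1 - G)).
    """
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    f_n = np.arange(1, n + 1) / n  # F_n evaluated at the order statistics
    return n * np.mean(bernoulli_kl(f_n, cdf_g(x)))


# Example: a standard-normal sample tested against the N(0, 1) d.f.
rng = np.random.default_rng(0)
print(quantal_kl_statistic(rng.normal(size=200), norm.cdf))
```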

Quantal Fisher Information and Quantal Kullback-Leibler Information
Estimation of the Quantal KL Information
Concluding Remarks