Abstract

The concepts of entropy and divergence, along with their past, residual, and interval variants, are revisited in a reliability theory context, and generalized families based on ϕ-functions are discussed. Special emphasis is given to the parametric family of entropies and divergences of Cressie and Read. For non-negative and absolutely continuous random variables, the extropy, the dual of Shannon entropy as a measure of uncertainty, is considered and its link to a specific member of the family of ϕ-entropies is shown. A number of examples demonstrate the implementation of the generalized entropies and divergences, exhibiting their utility.
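For orientation, a minimal sketch of the two quantities named above, written in standard textbook notation (which may differ from the paper's own conventions); here f and g denote probability densities and λ is the Cressie–Read index.

```latex
% Cressie-Read power divergence between densities f and g (index \lambda \neq 0, -1);
% the Kullback-Leibler divergence is recovered in the limit \lambda \to 0.
\[
  D_{\lambda}(f \,\|\, g)
  = \frac{1}{\lambda(\lambda+1)}
    \int f(x) \left[ \left( \frac{f(x)}{g(x)} \right)^{\lambda} - 1 \right] dx .
\]

% Extropy of a non-negative, absolutely continuous random variable X with density f,
% the dual of the Shannon entropy H(X) = -\int f(x) \log f(x)\, dx.
\[
  J(X) = -\tfrac{1}{2} \int_{0}^{\infty} f^{2}(x)\, dx .
\]
```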
