Abstract

Tsallis introduced a non-logarithmic generalization of Shannon entropy, namely Tsallis entropy, which is non-extensive. Based on this non-extensive entropy measure, Sati and Gupta proposed a cumulative residual information measure, namely cumulative residual Tsallis entropy (CRTE), and its dynamic version, dynamic cumulative residual Tsallis entropy (DCRTE). In the present paper, we propose non-parametric kernel-type estimators for CRTE and DCRTE when the observations exhibit a ρ-mixing dependence condition. Asymptotic properties of the estimators are established under suitable regularity conditions. A numerical evaluation of the proposed estimators is exhibited, and a Monte Carlo simulation study is carried out.

Highlights

  • Shannon [1] made his mark in statistics by introducing the concept of entropy, a measure of disorder in a probability distribution

  • We propose non-parametric kernel-type estimators of cumulative residual Tsallis entropy (CRTE) and dynamic cumulative residual Tsallis entropy (DCRTE) under the assumption that the underlying lifetimes are ρ-mixing

  • Non-parametric kernel-type estimators for CRTE and DCRTE were proposed for observations which exhibit ρ-mixing dependence



Introduction

Accepted: 19 December 2021

Shannon [1] made his mark in statistics by introducing the concept of entropy, a measure of disorder in a probability distribution. For an absolutely continuous random variable X with probability density function (pdf) f(x), cumulative distribution function (cdf) F(x) and survival function (sf) F̄(x) = 1 − F(x), Shannon entropy is defined as

ζ(X) = −∫₀^∞ f(x) log f(x) dx,

where log(·) is the natural logarithm, with the standard convention 0 log 0 = 0. Nowadays, this measure holds a special place in sciences such as physics, chemistry, computer science, wavelet analysis, image recognition and fuzzy sets. Following Shannon's pioneering work, the literature has produced a significant number of related papers, obtained by incorporating additional parameters that make these entropies sensitive to different shapes of probability distributions.
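To make the definition above concrete, the following is a minimal sketch of a kernel plug-in (resubstitution) estimate of Shannon entropy, illustrating the general kernel-type estimation idea; it is not the estimator proposed in this paper, and the Gaussian kernel, Silverman bandwidth rule, and all function names are illustrative assumptions. For an Exp(λ) lifetime the closed form is ζ(X) = 1 − log λ, so with λ = 1 the estimate should be near 1.

```python
# Illustrative sketch only: kernel plug-in estimate of Shannon entropy.
# The kernel, bandwidth rule, and sample model are assumptions for the demo,
# not the authors' proposed CRTE/DCRTE estimators.
import math
import random

def gaussian_kernel(u):
    """Standard normal density used as the smoothing kernel."""
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def kde(x, sample, h):
    """Kernel density estimate f_hat(x) with bandwidth h."""
    n = len(sample)
    return sum(gaussian_kernel((x - xi) / h) for xi in sample) / (n * h)

def entropy_estimate(sample):
    """Resubstitution estimator: -(1/n) * sum_i log f_hat(X_i)."""
    n = len(sample)
    mean = sum(sample) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
    h = 1.06 * sd * n ** (-1 / 5)  # Silverman's rule-of-thumb bandwidth
    return -sum(math.log(kde(xi, sample, h)) for xi in sample) / n

random.seed(0)
data = [random.expovariate(1.0) for _ in range(1000)]  # Exp(1) lifetimes
est = entropy_estimate(data)
# True value for Exp(1) is 1 - log(1) = 1; the KDE estimate is close,
# up to boundary bias near x = 0.
print(est)
```

Note that the resubstitution estimator carries boundary bias for lifetimes supported on [0, ∞), since the Gaussian kernel spreads mass below zero; boundary-corrected kernels would reduce this in practice.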

