Abstract

Extropy, the complementary dual of entropy, has been discussed in a growing body of literature, where it has also served as the basis for other extended information measures. In this article, we obtain the extropy of generalized order statistics via its dual and give some examples from well-known distributions. Furthermore, we study the residual and past extropy for such models. In addition, based on the Farlie–Gumbel–Morgenstern distribution, we consider the residual extropy of concomitants of m-generalized order statistics and present this measure with some additional features, providing an upper bound and stochastic orders for it. Finally, nonparametric estimation of the residual extropy of concomitants of m-generalized order statistics is illustrated using simulated data and real data connected with the COVID-19 virus.

Highlights

  • Shannon [1] introduced a well-known vintage measure of uncertainty called Shannon entropy. This information-theoretic entropy is applied in diverse fields such as financial analysis, computer science, and medical research.

  • The extropy proposed by Lad et al. [2] complements the notions of information based on entropy. They exhibited that entropy has a complementary dual function known as "extropy." For a discrete density with probabilities θ_1, …, θ_N, the extropy measure −∑_{i=1}^{N} (1 − θ_i) log(1 − θ_i) is neatly close to 1 − (1/2)∑_{i=1}^{N} θ_i² when the range of possibilities increases. Correspondingly, for a continuous density, the extropy of a nonnegative continuous random variable (r.v.) X with probability density function (PDF) f(x) is defined as

    J(X) = −(1/2) ∫₀^∞ f²(x) dx.  (1)

  • The extropy measure has been developed for ordered variables.
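As a small numerical sketch (not taken from the article; the function names and the uniform test case are our own choices), the discrete extropy of Lad et al. [2] and its quadratic approximation can be compared directly, showing that the gap shrinks as the number of possibilities grows:

```python
import numpy as np

def discrete_extropy(theta):
    """Discrete extropy of Lad et al.:
    J = -sum_i (1 - theta_i) * log(1 - theta_i)."""
    theta = np.asarray(theta, dtype=float)
    return -np.sum((1.0 - theta) * np.log(1.0 - theta))

def quadratic_approximation(theta):
    """Second-order approximation 1 - (1/2) * sum_i theta_i^2,
    accurate when every theta_i is small (many possibilities)."""
    theta = np.asarray(theta, dtype=float)
    return 1.0 - 0.5 * np.sum(theta ** 2)

# Uniform distribution on N points: the approximation error shrinks with N.
for N in (2, 10, 100):
    theta = np.full(N, 1.0 / N)
    print(N, discrete_extropy(theta), quadratic_approximation(theta))
```

For N = 2 the two values are ln 2 ≈ 0.693 and 0.75; by N = 100 both are about 0.995.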


Summary

Introduction

Shannon [1] introduced a well-known vintage measure of uncertainty called Shannon entropy. This information-theoretic entropy is applied in diverse fields such as financial analysis, computer science, and medical research. Throughout this paper, we propose the extropy of m-generalized order statistics (m-gos) and their dual (m-dgos) and study the related extropy measures for those models. We consider this model in terms of its upper bound and produce some examples on it.
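Since the paper later treats nonparametric estimation of extropy, a simple plug-in estimate of J(X) = −(1/2)∫f²(x)dx can be sketched via a kernel density estimate, using the identity ∫f²(x)dx = E[f(X)]. This is only an illustration under our own assumptions, not necessarily the estimator the authors use:

```python
import numpy as np
from scipy.stats import gaussian_kde

def extropy_estimate(sample):
    """Plug-in estimate of J(X) = -(1/2) * E[f(X)]:
    average a Gaussian KDE over the sample itself."""
    kde = gaussian_kde(sample)
    return -0.5 * np.mean(kde(sample))

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
# Closed form for N(0, 1): J(X) = -1 / (4 * sqrt(pi)) ~ -0.141
print(extropy_estimate(x))
```

The estimate should land close to the closed-form value −1/(4√π) for a standard normal sample of this size; the KDE bandwidth introduces a small smoothing bias.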

Extropy of m-Generalized Order Statistics and Its Dual
Residual and Past Extropy of m-Generalized Order Statistics and Its Dual
Residual Extropy of Concomitants of m-Generalized Order Statistics
Stochastic Orders
Nonparametric Estimation
Conclusion
