Abstract
Rényi entropy is an important measure of information introduced by Rényi in information theory. In this paper, we study this measure in detail for multivariate skew Laplace distributions, and then extend the study to the class of mixture models of multivariate skew Laplace distributions. Upper and lower bounds on the Rényi entropy of the mixture model are determined. In addition, an asymptotic approximation for the Rényi entropy is derived. Finally, a real data example illustrates the behavior of the entropy of the mixture model under consideration.
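As a hedged illustration of the quantity studied here (not the paper's multivariate skew Laplace derivation), the sketch below uses the definition of Rényi entropy of order α, H_α = (1 − α)^{-1} log ∫ f(x)^α dx, for the simpler univariate symmetric Laplace(0, b) density, where a closed form H_α = log(2b) + log(α)/(α − 1) is easy to check against a Monte Carlo estimate based on the identity ∫ f^α dx = E_f[f(X)^{α−1}]. All function names and parameter choices are illustrative assumptions.

```python
import numpy as np

def laplace_pdf(x, b):
    """Density of the symmetric Laplace(0, b): f(x) = exp(-|x|/b) / (2b)."""
    return np.exp(-np.abs(x) / b) / (2.0 * b)

def renyi_entropy_mc(alpha, b, n=200_000, seed=0):
    """Monte Carlo estimate of H_alpha via E_f[f(X)^(alpha-1)], X ~ Laplace(0, b)."""
    rng = np.random.default_rng(seed)
    x = rng.laplace(loc=0.0, scale=b, size=n)
    return np.log(np.mean(laplace_pdf(x, b) ** (alpha - 1.0))) / (1.0 - alpha)

def renyi_entropy_exact(alpha, b):
    """Closed form for the symmetric Laplace(0, b): log(2b) + log(alpha)/(alpha-1)."""
    return np.log(2.0 * b) + np.log(alpha) / (alpha - 1.0)

if __name__ == "__main__":
    alpha, b = 2.0, 1.5
    print(renyi_entropy_mc(alpha, b))     # stochastic estimate
    print(renyi_entropy_exact(alpha, b))  # closed form, log(3) + log(2)
```

For mixture densities such as those treated in the paper, the integral ∫ f^α dx generally has no closed form, which is why bounds and asymptotic approximations of the kind derived here are of interest; the same Monte Carlo identity still applies when the mixture can be sampled.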