Abstract

Mixture models are in high demand for machine-learning analysis due to their computational tractability and because they serve as a good approximation for continuous densities. Predominantly, entropy applications have been developed in the context of mixtures of normal densities. In this paper, we consider a novel class of skew-normal mixture models, whose flexible components can capture skewness. We find upper and lower bounds for the Shannon and Rényi entropies of this model. Using such a pair of bounds, a confidence interval for the approximate entropy value can be calculated. In addition, an asymptotic expression for the Rényi entropy is obtained via Stirling's approximation, and upper and lower bounds are reported using multinomial coefficients and some properties and inequalities of L p metric spaces. Simulation studies are conducted, and the methodology is applied to a swordfish (Xiphias gladius Linnaeus) length dataset.

Highlights

  • Mixture models are in high demand for machine-learning analysis, due to their computational tractability and for offering a good approximation for continuous densities [1]

  • We present practical results on upper and lower bounds of the Shannon and Rényi entropies for finite mixtures of multivariate skew-normal (FMSN) distributions in Sections 3.1 and 3.2, respectively, which ought to be considered in numerical simulations and real-world applications (Section 4)

  • To study the behavior of the Shannon entropy bounds of Proposition 2 and the Rényi entropy bounds of Equation (13) and Lemma 1, some examples are simulated for the cases d = 1, 2 and 3: Example 1: d = 1, m = 2, π = (0.3, 0.7), ξ = (0.5, 5), Ω = (3.5, 6), and η = (0.5, 3.5); Example 2: [24] d = 1, m = 3, π = (0.5, 0.2, 0.3), ξ = (2, 20, 35), Ω = (9, 16, 9), and η = (5, 3, 6); Example 3: [24] d = 2, m = 2, π = (0.65, 0.35)
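The simulated setting of Example 1 can be reproduced in a minimal sketch. Note the parameter mapping here is an assumption: ξ is taken as the location, Ω as the squared scale, and η as the shape (skewness) parameter of `scipy.stats.skewnorm`; the paper's exact FMSN parameterization may differ. The Shannon entropy of the mixture is estimated by plain Monte Carlo.

```python
# Sketch of Example 1 (d = 1, m = 2): a two-component skew-normal mixture.
# Assumed mapping (not confirmed by the paper): xi -> loc, sqrt(Omega) -> scale,
# eta -> shape parameter 'a' of scipy.stats.skewnorm.
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(1)

pi = np.array([0.3, 0.7])       # mixing weights
xi = np.array([0.5, 5.0])       # locations
omega = np.sqrt([3.5, 6.0])     # scales (assuming Omega is the squared scale)
eta = np.array([0.5, 3.5])      # shape / skewness parameters

comps = [skewnorm(a, loc=l, scale=s) for a, l, s in zip(eta, xi, omega)]

def mix_pdf(x):
    """Density of the mixture at points x."""
    return sum(p * c.pdf(x) for p, c in zip(pi, comps))

# Monte Carlo estimate of the Shannon entropy H(f) = -E[log f(X)], X ~ f:
# first draw component labels, then draw from the chosen components.
n = 100_000
which = rng.choice(2, size=n, p=pi)
x = np.concatenate([comps[k].rvs((which == k).sum(), random_state=rng)
                    for k in range(2)])
h_mc = -np.mean(np.log(mix_pdf(x)))
print(h_mc)
```

Since the skew-normal mixture entropy has no closed form, this Monte Carlo value is the kind of quantity the paper's upper and lower bounds are designed to bracket.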



Introduction

Mixture models are in high demand for machine-learning analysis due to their computational tractability and because they offer a good approximation for continuous densities [1]. However, no analytical expressions exist for bounds on the Shannon entropy of normal mixtures. The entropy applications mentioned above have been developed in the normal context, although several results on both the Shannon and Rényi entropies for various multivariate distributions do exist (see, e.g., [11,12]). We calculate bounds for the Shannon and Rényi entropies of the skew-normal mixture model. For the Shannon entropy case, the maximum entropy theorem and Jensen's inequality are used. Using such a pair of bounds, a confidence interval for the approximate entropy value can be calculated.
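The role of a pair of entropy bounds can be illustrated with the classic sandwich for a mixture f = Σᵢ πᵢ fᵢ, namely Σᵢ πᵢ H(fᵢ) ≤ H(f) ≤ H(π) + Σᵢ πᵢ H(fᵢ). This is a generic sketch on a two-component normal mixture (chosen because normal component entropies are closed-form), not the paper's skew-normal bounds; the parameter values are illustrative.

```python
# Sketch: the standard Shannon-entropy sandwich for a mixture,
#   sum_i pi_i H(f_i)  <=  H(f)  <=  H(pi) + sum_i pi_i H(f_i),
# checked by Monte Carlo on a two-component normal mixture.
import numpy as np

rng = np.random.default_rng(0)

pi = np.array([0.3, 0.7])        # mixing weights
mu = np.array([0.0, 5.0])        # component means (illustrative values)
sigma = np.array([1.0, 2.0])     # component standard deviations

# Closed-form entropy of each normal component: 0.5 * ln(2*pi*e*sigma^2).
h_comp = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
h_pi = -np.sum(pi * np.log(pi))  # entropy of the mixing weights

lower = np.sum(pi * h_comp)      # lower bound (concavity of entropy)
upper = h_pi + lower             # upper bound (Jensen-type argument)

# Monte Carlo estimate of H(f) = -E[log f(X)] with X ~ f.
n = 200_000
comp = rng.choice(2, size=n, p=pi)
x = rng.normal(mu[comp], sigma[comp])

def mix_pdf(x):
    """Density of the normal mixture at points x."""
    dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
        / (sigma * np.sqrt(2 * np.pi))
    return dens @ pi

h_mc = -np.mean(np.log(mix_pdf(x)))
print(lower, h_mc, upper)
```

The interval [lower, upper] brackets the true entropy, which is the sense in which a pair of bounds yields a confidence-interval-like statement for the approximate entropy value.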

Skew-Normal Distribution
Finite Mixtures of Skew-Normal Distributions
Entropies
Results
Shannon Entropy Bounds
Rényi Entropy Bounds
Simulations
Application
Data and Software
Length–Weight Relationship
Clustering and Model Selection
Methodology
