Abstract

The recent introduction of geometric partition entropy offered an alternative to differential Shannon entropy for quantifying the uncertainty in a sample drawn from a one-dimensional bounded continuous probability distribution. In addition to offering a fresh perspective on the foundations of continuous information theory, this approach provided several improvements over traditional entropy estimators, including effectiveness on sparse samples and proper incorporation of the impact of extreme outliers. Moreover, a complementary relationship exists between the new geometric approach and the basic form of its frequency-based predecessor, and that relationship is leveraged here to define an entropy measure with no bias with respect to sample size. This stable normalized measure is named the Boltzmann-Shannon interaction entropy (BSIE), as it is defined in terms of a standard divergence between the measure-based and frequency-based distributions associated with these two historical figures. This parameter-free measure can be estimated accurately and efficiently, and we illustrate its utility as a quality metric for subsampling in the context of nonlinear polynomial regression.
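To make the divergence-based construction concrete, the sketch below illustrates the general recipe the abstract describes: over a single partition of the bounded support, compare the frequency-based ("Boltzmann") distribution of sample counts against the measure-based ("geometric") distribution of interval widths. This is a minimal sketch under stated assumptions, not the paper's exact definition: the partition scheme, the choice of divergence (Kullback-Leibler is used here), any normalization, and the function name `interaction_divergence` are all illustrative assumptions.

```python
# Illustrative sketch only: the paper's exact partition, divergence,
# and normalization may differ from the assumptions made here.
import numpy as np

def interaction_divergence(sample, edges):
    """KL divergence between the frequency-based and measure-based
    distributions induced by a partition of the bounded support.

    sample : 1-D array of points from a bounded continuous distribution
    edges  : partition boundaries covering the support, strictly increasing
    """
    counts, _ = np.histogram(sample, bins=edges)
    q = counts / counts.sum()          # frequency-based (Boltzmann) weights
    widths = np.diff(edges)
    p = widths / widths.sum()          # measure-based (geometric) weights
    mask = q > 0                       # convention: 0 * log(0/p) = 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

# Hypothetical usage: a bounded sample on [0, 1] with a uniform 20-bin partition.
rng = np.random.default_rng(0)
x = rng.beta(2.0, 5.0, size=200)
edges = np.linspace(0.0, 1.0, 21)
print(interaction_divergence(x, edges))
```

Note the two limiting cases of this recipe: on a uniform partition the measure-based weights are flat, so the divergence reduces to a function of the Shannon histogram entropy, while on a sample-quantile partition the frequency weights are nearly flat and the measure-based gap lengths carry the geometric information.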
