Abstract

Entropy makes it possible to measure the uncertainty about an information source from the distribution of its output symbols. It is known that the maximum Shannon entropy of a discrete source of information is reached when its symbols follow a Uniform distribution. In cryptography, such sources have great applications since they allow the highest security standards to be reached. In this work, the most effective estimator is selected for estimating entropy in short samples of bytes and bits with maximum entropy. For this, 18 estimators were compared. Results concerning the comparisons between these estimators published in the literature are discussed. The most suitable estimator is determined experimentally, based on its bias and mean square error in short samples of bytes and bits.
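As a point of reference for the maximum-entropy claim above (standard Shannon definitions; the symbols K and p_i are ours, not quoted from the paper): for a discrete source with symbol probabilities p_1, ..., p_K,

    H = -\sum_{i=1}^{K} p_i \log_2 p_i \le \log_2 K,

with equality exactly when p_i = 1/K for all i. A uniform byte source (K = 256) therefore attains the maximum of log2 256 = 8 bits per symbol, and a uniform bit source (K = 2) attains 1 bit per symbol.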

Highlights

  • Entropy allows the measurement of the uncertainty about an information source from the distribution of its output symbols [1]

  • Most of the known estimators are generally applied to the estimation of mutual information [12], a scenario in which the estimators perform similarly and the selection between them is less of an issue

  • A comparison was made between 18 estimators of entropy in short uniformly distributed samples of bytes and bits, based on their bias, variance, and mean square error
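The three comparison criteria in the last highlight are linked by the standard bias-variance decomposition (textbook statistics, not specific to this paper): for an estimator \hat{H} of the true entropy H,

    \mathrm{MSE}(\hat{H}) = \mathbb{E}[(\hat{H} - H)^2] = \mathrm{Bias}(\hat{H})^2 + \mathrm{Var}(\hat{H}),

so an estimator may accept a small bias in exchange for a large reduction in variance and still achieve a lower mean square error.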

Summary

Introduction

Entropy allows the measurement of the uncertainty about an information source from the distribution of its output symbols [1]. Most of the known estimators are generally applied to the estimation of mutual information [12], but in that scenario the entropy estimators perform similarly and the selection between different estimators is less of an issue. Here, by contrast, the estimation of entropy itself is vital since it must work with smaller samples [23,24]. In these cases, it is necessary to have the estimator with the highest convergence rate and the lowest mean square error in order to increase the precision of the result. It is not easy to find results on applying or selecting entropy estimators for uniformly distributed samples composed of bytes and bits. The most suitable method for estimating entropy in short, uniformly distributed samples of bytes and bits is determined experimentally from its bias and mean square error characteristics. The structure of the paper is as follows: Section 2 presents some preliminaries about entropy and the comparison criteria for the estimators; Section 3 discusses some of the entropy estimators from the literature; Section 4 presents the main results, which concern the selection of the entropy estimators; Section 5 presents some conclusions and possible future lines of work.
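To make the comparison criteria concrete, the following is a minimal sketch, not the paper's code: it measures, by Monte Carlo simulation, the bias and mean square error of the standard plug-in (maximum-likelihood) entropy estimator on short uniformly distributed byte samples. The sample sizes, trial count, and function names are illustrative assumptions.

    import numpy as np

    def plugin_entropy(sample, k=256):
        # Plug-in (maximum-likelihood) Shannon entropy estimate, in bits.
        counts = np.bincount(sample, minlength=k)
        p = counts[counts > 0] / len(sample)
        return -np.sum(p * np.log2(p))

    def bias_and_mse(estimator, n, k=256, trials=2000, seed=0):
        # Monte Carlo bias and mean square error of `estimator` on
        # uniform samples of n symbols drawn from {0, ..., k-1}; the
        # true entropy of the uniform source is log2(k).
        rng = np.random.default_rng(seed)
        true_h = np.log2(k)
        est = np.array([estimator(rng.integers(0, k, size=n), k)
                        for _ in range(trials)])
        return est.mean() - true_h, np.mean((est - true_h) ** 2)

    for n in (64, 128, 256, 512):
        bias, mse = bias_and_mse(plugin_entropy, n)
        print(f"n={n:4d}  bias={bias:+.4f}  MSE={mse:.4f}")

In such an experiment the plug-in estimator shows its well-known negative bias, roughly -(K - 1)/(2N ln 2) bits to first order (the term the Miller-Madow correction compensates); reducing this kind of systematic error in short samples is exactly what the corrected estimators compared in the paper aim at.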

Shannon Entropy
Comparison Criterion between the Estimators of H
Entropy Estimators
Theoretical Approximations between Estimators of Entropy
Previous Work on Comparison of Entropy Estimators
Selecting an Effective Entropy Estimator through Experimental Evaluation
Implementation of Entropy Estimators
Analysis of Bias between Estimators
Comparison of Estimators in Terms of Mean Square Error
Correlation between Estimators of Entropy Using Bias
Conclusions
