Abstract

The order-q Tsallis and Rényi entropies find broad application in the statistical analysis of complex phenomena. A generic problem arises, however, when these entropies must be estimated from observed data: the finite size of data sets can lead to serious systematic and statistical errors in numerical estimates. In this paper, we focus on the problem of estimating generalized entropies from finite samples and derive the Bayes estimator of the order-q Tsallis entropy, including the order-1 (i.e. the Shannon) entropy, under the assumption of a uniform prior probability density. The Bayes estimator yields, in general, the smallest mean-quadratic deviation from the true parameter as compared with any other estimator. Exploiting the functional relationship between the Tsallis and the Rényi entropies, we use the Bayes estimator of the Tsallis entropy to estimate the Rényi entropy. We compare these novel estimators with the frequency-count estimators of the Tsallis and Rényi entropies. We find by numerical simulations that the Bayes estimator reduces the statistical errors of order-q entropy estimates for Bernoulli processes as well as for higher-order Markov processes derived from the complete genome of the prokaryote Haemophilus influenzae.
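The contrast between the two estimators can be sketched in code. The following is a minimal illustration, not the paper's full derivation: it assumes that a uniform (Dirichlet with all parameters equal to 1) prior over M cell probabilities yields, given counts n_i, a Dirichlet posterior with parameters n_i + 1, so that the posterior moments of the p_i give a closed-form estimate of the sum of p_i^q. The Rényi entropy is then recovered from the Tsallis estimate via the standard functional relationship R_q = ln(1 + (1 − q) S_q) / (1 − q). The function and variable names are illustrative choices, not the paper's notation.

```python
import math

def tsallis_plugin(counts, q):
    """Frequency-count (plug-in) estimate of the order-q Tsallis entropy.

    counts: observed occurrence counts per cell; q: entropy order (q != 1
    here; the q -> 1 limit recovers the Shannon entropy).
    """
    n = float(sum(counts))
    s = sum((c / n) ** q for c in counts if c > 0)
    return (1.0 - s) / (q - 1.0)

def tsallis_bayes(counts, q, n_cells=None):
    """Bayes estimate of the order-q Tsallis entropy under a uniform prior.

    Assumption of this sketch: a Dirichlet(1, ..., 1) prior over n_cells
    probabilities gives a Dirichlet(n_1 + 1, ..., n_M + 1) posterior, whose
    moments yield E[p_i^q] = Gamma(n_i + 1 + q) Gamma(N + M)
                             / (Gamma(n_i + 1) Gamma(N + M + q)).
    """
    if n_cells is None:
        n_cells = len(counts)
    big_n = sum(counts)
    # Include cells with zero observed counts (their posterior mass is nonzero).
    all_counts = list(counts) + [0] * (n_cells - len(counts))
    total = big_n + n_cells  # posterior concentration: sum of (n_i + 1)
    expected_sum = sum(
        math.exp(math.lgamma(c + 1 + q) - math.lgamma(c + 1)
                 + math.lgamma(total) - math.lgamma(total + q))
        for c in all_counts
    )
    return (1.0 - expected_sum) / (q - 1.0)

def renyi_from_tsallis(s_q, q):
    """Convert an order-q Tsallis entropy estimate into a Rényi estimate."""
    return math.log(1.0 + (1.0 - q) * s_q) / (1.0 - q)
```

For a uniform sample of four symbols observed five times each with q = 2, the plug-in estimate is 0.75, while the Bayes estimate is slightly smaller (0.72), reflecting the smoothing introduced by the prior; the corresponding Rényi estimate follows from `renyi_from_tsallis`.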
