Abstract

This paper addresses the control of estimation bias when estimating mutual information nonparametrically. We focus on continuously distributed random data, and the estimators we develop are based on a nonparametric k-nearest-neighbor approach for arbitrary metrics. Using a multidimensional Taylor series expansion, a general relationship between the estimation bias and the neighborhood size of the plug-in entropy estimator is established, without any assumption on the data, for two different norms. The theoretical analysis developed for the maximum norm agrees with the experimental results of the numerical tests performed by Kraskov et al. [Phys. Rev. E 69, 066138 (2004)]. To further validate this relation, a weighted linear combination of distinct mutual information estimators is proposed and, using simulated signals, a comparison of different strategies corroborates the theoretical analysis.
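As an illustration of the kind of k-nearest-neighbor estimator whose bias the analysis targets, the following sketch implements the first estimator of Kraskov et al. with the maximum norm. It is not code from the paper: the function name, the default k=3, and the small tolerance used to enforce a strict inequality are our own illustrative choices, and the paper's weighted linear combination of estimators is not shown.

```python
import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree

def ksg_mutual_information(x, y, k=3):
    """Sketch of the first KSG estimator of I(X;Y) (Kraskov et al., 2004),
    I ~ psi(k) + psi(N) - <psi(n_x + 1) + psi(n_y + 1)>, with the max norm.
    Assumes continuously distributed data (no tied distances)."""
    x = np.asarray(x).reshape(len(x), -1)
    y = np.asarray(y).reshape(len(y), -1)
    n = len(x)
    xy = np.hstack((x, y))

    # Distance to the k-th neighbor in the joint space (Chebyshev/max norm);
    # k + 1 is queried because each point is returned as its own 0-distance neighbor.
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]

    # Count marginal neighbors strictly closer than eps_i (tolerance emulates
    # the strict inequality); subtract 1 to exclude the point itself.
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    nx = np.array([len(tree_x.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(tree_y.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])

    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

As a quick sanity check on simulated signals, for jointly Gaussian X and Y with correlation rho the true mutual information is -(1/2) log(1 - rho^2), so the estimator's output can be compared against this closed form for varying k to observe the bias-versus-neighborhood-size behavior the paper analyzes.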
