Abstract

Transfer entropy (TE) is a widely applied measure of causality, and one of its identities expresses it as a sum of mutual information terms. In this article we evaluate two existing methods of mutual information estimation in the specific application of detecting causality between a discrete random process and a continuous random process: the binning method and the nearest neighbours method. Simulated examples confirm that, overall, the nearest neighbours method detects causality more reliably than the binning method.
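
For concreteness, one common way of writing such an identity (the notation below is assumed here, not taken from the article: Y_t^{(l)} denotes the vector of the l most recent past values of Y at time t) expands TE, a conditional mutual information, by the chain rule into a signed sum of two mutual information terms:

    T_{X \to Y} = I\bigl(Y_{t+1} ; X_t^{(k)} \mid Y_t^{(l)}\bigr)
                = I\bigl(Y_{t+1} ; X_t^{(k)}, Y_t^{(l)}\bigr) - I\bigl(Y_{t+1} ; Y_t^{(l)}\bigr)

Each term on the right-hand side is a plain mutual information, so any mutual information estimator for mixed discrete/continuous variables yields a TE estimator.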

Highlights

  • Transfer entropy (TE), like Granger causality and directed information, is a measure of causality

  • The contribution of this paper is to investigate the application of both the binning method and the nearest neighbours method to estimating the mutual information terms in the TE identity

  • To evaluate the performance of these methods, we have developed examples involving causality in mixed cases (see the simulation sketch after this list)
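
As an illustration of what a mixed causal example might look like (a hypothetical construction in Python, not necessarily one of the article's own examples), the sketch below simulates a binary process X whose past drives a continuous process Y:

import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical mixed causal pair: a discrete (Bernoulli) process X
# drives a continuous AR(1)-type process Y with a one-step lag, so
# information flows from X to Y but not from Y to X.
x = rng.integers(0, 2, size=n)          # discrete, values in {0, 1}
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 1.0 * x[t - 1] + rng.normal(scale=1.0)

# A good TE estimator should report TE(X -> Y) > 0 and TE(Y -> X) near 0.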

Summary

INTRODUCTION

Transfer entropy (TE), like Granger causality and directed information, is a measure of causality. The purpose of this paper is to evaluate TE estimators for mixed cases, in which one process is discrete and the other continuous. This may be of interest to those who work with mixed processes and need a causality measure, for example in the context of a Poisson channel with feedback. Two methods of estimation are explored in this paper for this case of mixed processes. Both methods stem from an identity for TE, written as a sum of two mutual information terms.
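
As a concrete sketch of the two estimators on the core mixed step (the data-generating model and all parameter values below are illustrative assumptions; the nearest neighbours estimate uses scikit-learn's mutual_info_classif, a kNN-based estimator for a continuous feature and a discrete target):

import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
n = 5000

# Illustrative mixed pair: a discrete variable z shifts the mean of a
# continuous variable x, so I(X; Z) > 0.
z = rng.integers(0, 2, size=n)
x = 1.0 * z + rng.normal(scale=1.0, size=n)

def binned_mi(x, z, n_bins=10):
    """Plug-in MI estimate: discretize the continuous variable into
    equal-width bins and read I(X; Z) off the joint histogram."""
    joint, _, _ = np.histogram2d(x, z, bins=[n_bins, len(np.unique(z))])
    pxz = joint / joint.sum()
    px = pxz.sum(axis=1, keepdims=True)   # marginal of binned x
    pz = pxz.sum(axis=0, keepdims=True)   # marginal of z
    nz = pxz > 0                          # skip empty cells, avoid log(0)
    return float((pxz[nz] * np.log(pxz[nz] / (px @ pz)[nz])).sum())

mi_knn = mutual_info_classif(x.reshape(-1, 1), z, n_neighbors=3,
                             random_state=0)[0]

print(f"binning MI: {binned_mi(x, z):.3f} nats")
print(f"kNN MI:     {mi_knn:.3f} nats")

The article's TE estimators involve mutual information between the next sample and vectors of past samples; the sketch above only shows the mixed discrete/continuous mutual information estimation step on which both methods rest.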

NOTATION AND TERMINOLOGY
DEFINITIONS
ESTIMATORS
Binning Method
Nearest Neighbours Method
RESULTS
First Example
Second Example
Third Example
CONCLUSION