Abstract

This paper studies mutual information and transfer entropy for detecting cause-and-effect relationships between industrial process variables. Mutual information quantifies the degree of dependency between process variables, while transfer entropy detects the direction of information flow between them. The paper reviews the existing definitions and limitations of these two quantities and proposes an algorithm, based on combining and extending them, for more reliable identification of causal relationships between process variables. Detecting causal relationships between plant variables is useful for diagnosing the root cause of a distributed fault in the process; it also helps predict the affected variables. The proposed method is illustrated through an industrial case study.
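To make the two quantities concrete, the sketch below estimates both on a synthetic pair of time series in which `y` is driven by the past of `x`, so information should flow from `x` to `y`. The histogram-based entropy estimator, the AR(1) driver process, the coupling strength, and the bin count are all illustrative assumptions, not the paper's algorithm; mutual information is symmetric and cannot indicate direction, whereas transfer entropy is asymmetric and can.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical coupled pair: x is an AR(1) process and y lags x,
# so information should flow x -> y (assumed setup for illustration).
n = 5000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.9 * x[t - 1] + rng.normal()
y = np.zeros(n)
y[1:] = x[:-1] + 0.5 * rng.normal(size=n - 1)

def entropy(*cols, bins=8):
    """Joint Shannon entropy (nats) via a simple histogram estimator."""
    hist, _ = np.histogramdd(np.column_stack(cols), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(a, b, bins=8):
    # I(A;B) = H(A) + H(B) - H(A,B): symmetric, no direction.
    return (entropy(a, bins=bins) + entropy(b, bins=bins)
            - entropy(a, b, bins=bins))

def transfer_entropy(src, dst, bins=8):
    # T(src->dst) = H(dst_t | dst_{t-1}) - H(dst_t | dst_{t-1}, src_{t-1}),
    # expanded into joint entropies (history length 1 for simplicity).
    d_t, d_p, s_p = dst[1:], dst[:-1], src[:-1]
    return (entropy(d_t, d_p, bins=bins) - entropy(d_p, bins=bins)
            - entropy(d_t, d_p, s_p, bins=bins)
            + entropy(d_p, s_p, bins=bins))

mi = mutual_information(x, y)
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
print(f"I(x;y) = {mi:.3f}, T(x->y) = {te_xy:.3f}, T(y->x) = {te_yx:.3f}")
```

On this construction the mutual information is clearly positive, and the transfer entropy in the driving direction exceeds the reverse direction, which is the asymmetry a causality test exploits; histogram estimators are biased on finite data, so in practice such estimates are compared against a significance threshold rather than zero.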
