Abstract

Distributed architectures, in which multiple decision-making units coordinate their actions through real-time communication, have become increasingly important for monitoring processes with large scale and complex structure. Developing a distributed monitoring scheme typically involves two key steps: decomposing the process into subsystems, and designing local monitors based on the configured subsystem models. In this article, we propose a distributed process monitoring approach that addresses both steps for large-scale processes. A data-driven process decomposition method is proposed that leverages community structure detection to divide the variables into subsystems optimally by maximizing the modularity metric. A two-layer distributed monitoring scheme is then developed, in which local monitors are designed on the configured subsystems of variables using canonical correlation analysis. Intra-subsystem and inter-subsystem interactions are handled by the two layers separately, which improves the sensitivity of the monitoring scheme to certain types of faults. A numerical example illustrates the effectiveness and superiority of the proposed method, which is then applied to a simulated wastewater treatment process.
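The modularity-maximizing decomposition mentioned above can be illustrated with a minimal sketch. Here the process variables are treated as nodes of a graph whose edges are assumed to come from, e.g., thresholding pairwise correlations (the graph, the partitions, and the variable names below are purely illustrative, not taken from the article), and Newman's modularity Q scores how well a candidate partition matches the graph's block structure:

```python
def modularity(adj, partition):
    """Newman modularity Q of a partition (a list of sets of node indices)
    for an undirected graph given by adjacency matrix `adj`."""
    m2 = sum(sum(row) for row in adj)      # 2m: total degree over all nodes
    deg = [sum(row) for row in adj]        # degree of each node
    q = 0.0
    for community in partition:
        for i in community:
            for j in community:
                # actual edges minus those expected under a random null model
                q += adj[i][j] - deg[i] * deg[j] / m2
    return q / m2

# Toy "correlation graph": variables 0-2 form one cluster, 3-5 another,
# with a single weak link (2,3) between them.
adj = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]

good = [{0, 1, 2}, {3, 4, 5}]    # matches the block structure
poor = [{0, 3}, {1, 4}, {2, 5}]  # cuts across it
print(modularity(adj, good) > modularity(adj, poor))  # True
```

A decomposition procedure of the kind described would search over partitions (e.g., by greedy agglomeration) and keep the one with the largest Q, so that strongly interacting variables end up in the same subsystem.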

