Abstract

In industrial processes, there are specific faults that the conventional PCA algorithm cannot easily detect, because its monitoring models are built from the general distribution of normal data and may not highlight the abnormal changes. For such faults, if fault data are available and used in model development, more meaningful monitoring directions can be extracted, improving fault detection sensitivity. In the present work, a fault-relevant principal component analysis (FPCA) algorithm is proposed for statistical modeling and process monitoring using both normal and fault data. The key question is how to extract and supervise the data distribution directions that the fault influences. By analyzing the relative changes from normal to fault operation with the available fault data, the new model structure further decomposes the original PCA systematic subspace and residual subspace into two parts each. The part that shows larger variation relative to the normal case under the fault disturbance is regarded as more informative for fault detection (called the fault-relevant part here). It is then separated from the fault-irrelevant part and emphasized in online monitoring, which makes fault detection more effective. The proposed method provides a detailed insight into how the original normal process information decomposes from a fault-relevant perspective. Its fault detection sensitivity is illustrated with data from a numerical example and the Tennessee Eastman process.
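The idea of splitting the PCA systematic subspace by comparing normal and fault data can be sketched numerically. The following is a minimal illustration, not the paper's actual algorithm: it uses a hypothetical `fpca_split` function with a simple variance-ratio criterion (the abstract does not specify the decomposition criterion), flagging principal directions whose variance grows markedly under the fault as fault-relevant.

```python
import numpy as np

def fpca_split(X_normal, X_fault, n_pc, ratio_thresh=2.0):
    """Split the PCA systematic subspace into fault-relevant and
    fault-irrelevant directions (illustrative variance-ratio criterion)."""
    mu = X_normal.mean(axis=0)
    sd = X_normal.std(axis=0, ddof=1)
    Zn = (X_normal - mu) / sd           # normal data, autoscaled
    Zf = (X_fault - mu) / sd            # fault data scaled by NORMAL statistics
    # PCA of the normal data via SVD
    _, S, Vt = np.linalg.svd(Zn, full_matrices=False)
    P = Vt[:n_pc].T                     # loadings spanning the systematic subspace
    var_n = S[:n_pc] ** 2 / (len(Zn) - 1)    # normal-case variance along each PC
    var_f = (Zf @ P).var(axis=0, ddof=1)     # fault-case variance along each PC
    relevant = var_f / var_n > ratio_thresh  # larger relative change => fault-relevant
    return P[:, relevant], P[:, ~relevant]

# Synthetic latent-factor data: the fault inflates the common factor threefold.
rng = np.random.default_rng(0)
w = np.array([1.0, 1.0, 0.5, 0.2])
t = rng.normal(size=(200, 1))
Xn = t @ w[None, :] + 0.3 * rng.normal(size=(200, 4))
Xf = 3 * t @ w[None, :] + 0.3 * rng.normal(size=(200, 4))
P_rel, P_irr = fpca_split(Xn, Xf, n_pc=2)
```

Online monitoring would then compute a T²-type statistic on the scores projected onto `P_rel` only, so the fault-relevant directions are highlighted rather than averaged together with the fault-irrelevant ones.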
