Abstract

Data acquisition in process industries is usually time-based: every sample is transmitted at a fixed rate. The disadvantage is that a considerable amount of data carrying no new information about the state of the process is continuously transmitted and processed. This burdens the communication system and computational resources, an increasingly critical issue given the number of variables now measured, often at intervals of seconds. One solution is the event-driven paradigm, in which only data deemed relevant according to a pre-defined criterion is forwarded for further processing. This work investigated the event-based threshold and delta methods in the context of fault detection, and also analyzed the resulting data transmission rate. The well-known Tennessee Eastman problem (TEP) was used as a case study. The fault detection system was based on PCA (principal component analysis), which is widely used for this purpose on this benchmark. The results were compared with the commonly used time-based approach at a fixed false alarm rate. The threshold rule provided similar results, but with much less data. For the delta rule, significant reductions in the missed detection rate (MDR) of up to 74% were obtained for five of the six hard-to-detect faults, and of up to 69% for two of the three very hard-to-detect faults. MDR values very close to zero were reached for two of the three intermediate-detection faults and two of the hard-to-detect faults. The detection time was also evaluated: considerably shorter detection times were obtained for all intermediate-detection faults, three of the hard-to-detect faults, and all very hard-to-detect faults. In short, the delta method was able to improve fault detection performance, especially for hard-to-detect faults, with a considerably lower data transmission rate (around 20% on average). Event-driven data acquisition can therefore be very attractive for process industries.
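
The two event-triggering rules named above are standard in the event-based sampling literature: level-crossing (threshold) sampling and send-on-delta. The sketch below illustrates how such rules decide which samples to transmit; it is a minimal illustration for a scalar signal, and the function names, level set, and delta value are hypothetical choices, not taken from the paper.

```python
import numpy as np

def threshold_sampling(x, levels):
    """Transmit a sample only when the signal crosses one of the
    pre-defined threshold levels (event-based threshold rule)."""
    sent = [0]                     # always transmit the first sample
    last = x[0]                    # last transmitted value
    for k in range(1, len(x)):
        # an event occurs if any level lies between the last
        # transmitted value and the current value
        if any(min(last, x[k]) < lvl <= max(last, x[k]) for lvl in levels):
            sent.append(k)
            last = x[k]
    return sent

def send_on_delta(x, delta):
    """Transmit a sample only when it deviates from the last
    transmitted value by more than delta (send-on-delta rule)."""
    sent = [0]
    last = x[0]
    for k in range(1, len(x)):
        if abs(x[k] - last) > delta:
            sent.append(k)
            last = x[k]
    return sent

# Example: a noisy drifting signal; only a fraction of the 1000
# samples is transmitted under either rule.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(0.0, 0.1, 1000))
print(len(send_on_delta(x, delta=0.5)) / len(x))   # transmission rate
```

The ratio printed at the end is the fraction of samples actually transmitted, which corresponds to the data transmission rate analyzed in the paper; only these transmitted samples would be passed on to the PCA-based fault detection stage.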
