Abstract

Fault diagnosis in an open world refers to diagnosis tasks that must cope with previously unknown faults in the online stage. It faces a great challenge yet to be addressed—that is, the online data of unknown faults may be classified as normal samples with high probability. In this article, we develop an effective solution to this challenge by using supervised contrastive learning to learn a discriminative and compact embedding for the known normal situation and fault situations. Specifically, in addition to contrasting a given sample with other instances as in conventional contrastive learning methods, our training scheme contrasts the normal samples with negative augmentations of themselves. The negative out-of-distribution data are generated by the Soft Brownian Offset sampling method to simulate previously unknown faults. Computational experiments are conducted on the Tennessee Eastman Process benchmark dataset and a practical plasma etching process dataset. The proposed method achieves significant improvements over four existing methods under three open-set fault diagnosis circumstances, i.e., balanced open-set fault diagnosis, imbalanced fault diagnosis, and few-shot fault diagnosis. This demonstrates its great potential in real-world fault diagnosis applications.
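The out-of-distribution generation step described above can be illustrated with a minimal sketch. This is not the paper's implementation or the reference Soft Brownian Offset library; it is a simplified illustration of the idea (random Brownian steps from an in-distribution point, keeping only moves that push the point further from the data, until a target distance is reached), with illustrative parameter names such as `d_min` and `step`:

```python
import numpy as np

def soft_brownian_offset(X, d_min=1.5, n_steps=60, step=0.2, rng=None):
    """Simplified sketch of Soft Brownian Offset-style OOD sampling.

    Starting from a randomly chosen in-distribution sample, take small
    random (Brownian) steps and keep only moves that increase the
    distance to the nearest in-distribution sample, stopping once the
    point is at least d_min away. Parameters are illustrative.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = X[rng.integers(len(X))].copy()            # start at a random ID sample
    for _ in range(n_steps):
        offset = rng.normal(size=x.shape)
        offset *= step / np.linalg.norm(offset)   # unit-length step
        candidate = x + offset
        # distance from the candidate to the nearest in-distribution sample
        d_new = np.min(np.linalg.norm(X - candidate, axis=1))
        d_old = np.min(np.linalg.norm(X - x, axis=1))
        if d_new > d_old:
            x = candidate                         # keep outward-pushing moves
            if d_new >= d_min:
                break
    return x

# Usage: synthesize "unknown fault" samples around normal operating data
X_normal = np.random.default_rng(0).normal(size=(200, 2))
ood = np.stack([soft_brownian_offset(X_normal) for _ in range(10)])
```

In the proposed training scheme, such synthetic points would serve as negative augmentations of the normal class, so that the contrastive loss pushes the embedding of unknown-fault-like data away from the normal cluster.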
