Abstract

In this article, we study the challenging few-shot fault diagnosis (FSFD) problem, where only limited faulty samples are available. Metric-based meta-learning methods have been a prevalent approach to FSFD; however, most of them rely on learning a generalized distance metric and fall short of leveraging intraclass and interclass distribution information. To this end, we develop a novel reweighted regularized prototypical network to improve FSFD performance, in which an intraclass reweighting strategy reduces the influence of noise and outliers and yields stable estimates of fault prototypes. In addition, a novel balance-enforcing regularization (BER) is proposed to hedge against between-class imbalance and improve discrimination capability. These two remedies reduce the intraclass difference and enlarge the interclass difference via episodic training. In this way, an improved metric space and better diagnostic performance can be attained in a few-shot learning context. Case studies on the Tennessee Eastman benchmark process and a real-world railway turnout dataset demonstrate that the proposed FSFD approach compares favorably against state-of-the-art methods.
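To make the intraclass reweighting idea concrete, the sketch below estimates a fault prototype as a weighted mean of support-set embeddings, iteratively downweighting samples far from the current prototype. This is a minimal illustration under assumed details: the inverse-distance weighting scheme, the iteration count, and the function name are hypothetical, not the paper's exact formulation, and the BER term is omitted.

```python
import numpy as np

def reweighted_prototype(embeddings, n_iter=3):
    """Estimate a class prototype as a reweighted mean of support embeddings.

    Hypothetical sketch: samples far from the current prototype receive
    smaller weights, reducing the influence of noise and outliers. The
    inverse-distance weighting used here is an illustrative assumption,
    not the paper's exact reweighting strategy.
    """
    proto = embeddings.mean(axis=0)                      # start from the plain mean
    for _ in range(n_iter):
        d = np.linalg.norm(embeddings - proto, axis=1)   # distance of each sample to the prototype
        w = 1.0 / (d + 1e-8)                             # downweight distant (outlying) samples
        w /= w.sum()                                     # normalize weights to sum to 1
        proto = (w[:, None] * embeddings).sum(axis=0)    # weighted-mean prototype update
    return proto
```

With a support set containing one outlier, the reweighted prototype stays close to the inlier cluster, whereas the plain mean is pulled toward the outlier, which is the stabilizing effect the abstract attributes to intraclass reweighting.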

