Abstract
In practical industrial applications, it is crucial to train a robust fault diagnosis (FD) model that can quickly adapt to new working conditions or fault modes using only a few labeled fault samples. Therefore, a novel convolutional multi-head self-attention network-based meta-transfer learning approach (CMS-MTL) for few-shot fault diagnosis (FSFD) is proposed. Firstly, a convolutional multi-head self-attention network (CMHSAN) is designed, which ingeniously combines multi-head self-attention (MHSA) blocks and convolution blocks. Through the cooperation of the MHSA and convolution blocks, both the local and global feature information of the input time–frequency images is captured, so that discriminative features among the fault classes are fully extracted. Secondly, a three-stage CMHSAN-based meta-transfer learning (MTL) scheme is proposed, which provides a good initialization state for the meta-training of the CMHSAN model through the pre-training stage, updates the pre-trained model with scaling and shifting parameters in the meta-training stage, and fine-tunes the updated model in the meta-testing stage, so as to quickly adapt to new FSFD tasks from the target domain. Thirdly, for fault classes that are difficult to diagnose during meta-training, a meta-task re-training (MTRT) strategy is designed to learn more valuable transferable knowledge in the meta-training stage, thereby improving the adaptability of the CMHSAN model to hard FSFD tasks. Finally, extensive experiments are conducted under different FSFD scenarios to verify the effectiveness of the proposed approach. The results show that the approach can quickly adapt to new FSFD tasks through the learned meta-knowledge and achieves high diagnostic accuracy.
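To make the scaling-and-shifting update of the meta-training stage concrete, the sketch below shows one possible PyTorch-style parameterization: the pretrained convolution kernel is frozen, and the layer is adapted only through lightweight per-channel scale and shift parameters. This is a minimal sketch under stated assumptions; the class name SSConv2d and the parameter names phi_s1 and phi_s2 are illustrative, not taken from the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SSConv2d(nn.Module):
    """Convolution layer adapted via scaling and shifting (SS).

    The pretrained kernel stays frozen; only the per-output-channel
    scale (phi_s1) and shift (phi_s2) parameters are meta-learned.
    Names and shapes here are illustrative assumptions.
    """

    def __init__(self, pretrained_conv: nn.Conv2d):
        super().__init__()
        # Freeze the pretrained kernel so meta-training cannot alter it.
        self.weight = nn.Parameter(pretrained_conv.weight.data.clone(),
                                   requires_grad=False)
        self.stride = pretrained_conv.stride
        self.padding = pretrained_conv.padding
        out_ch = pretrained_conv.out_channels
        # Scale initialized to 1 and shift to 0, so the layer starts
        # exactly at the pretrained behavior.
        self.phi_s1 = nn.Parameter(torch.ones(out_ch, 1, 1, 1))
        self.phi_s2 = nn.Parameter(torch.zeros(out_ch))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Scale the frozen kernel channel-wise, then shift the output.
        return F.conv2d(x, self.weight * self.phi_s1, bias=self.phi_s2,
                        stride=self.stride, padding=self.padding)
```

Because only the small scale and shift tensors receive gradients, each few-shot task can be adapted quickly without the risk of overfitting the full pretrained kernel on a handful of labeled fault samples.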