Abstract

To address the data distribution discrepancy across scenarios, deep transfer learning uses data from similar scenarios to help the target scenario complete the recognition task. However, fault misrecognition and low diagnostic accuracy occur in cross-scenario applications because of the deep transfer model's weak representational capability. The Convolutional Block Attention Module (CBAM) can independently learn the importance of channel and spatial features, recalibrate them, and improve image classification performance. This study introduces the CBAM module into the Residual Network (ResNet) and proposes a transfer learning model that combines CBAM with an improved ResNet, denoted TL_CBAM_ResNet17. A miniature ResNet17 deep model is constructed based on the ResNet50 model, and the location at which the CBAM module is embedded in ResNet17 is determined so as to strengthen the model's representational capability. For effective cross-scenario transfer and reduced data distribution discrepancy between source and target domains, a multi-kernel Maximum Mean Discrepancy (MK-MMD) layer is added in front of the classifier layer of the ResNet17 model to select data with common domain features. Taking a reciprocating compressor as the research object, cross-scenario datasets are produced from vibration signals measured on a simulation test bench and from simulation signals generated by a dynamic simulation model, and mutual transfer experiments are conducted on these datasets. The proposed method (TL_CBAM_ResNet17) demonstrates better classification performance than TCA, JDA, the TL_ResNet50 model, the TL_ResNet17 model, and the TL_ResNet17 model integrated with other attention mechanism modules, and greatly improves fault-diagnosis accuracy and model generalization in cross-scenario applications.
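The following is a minimal PyTorch sketch of the building blocks named in the abstract: a CBAM module, a CBAM-augmented residual block, and a multi-kernel MMD loss applied to the features entering the classifier. The layer sizes, kernel bandwidths, and block layout are illustrative assumptions, not the authors' exact ResNet17 configuration.

```python
# Sketch of CBAM, a CBAM-augmented residual block, and an MK-MMD loss.
# Hyperparameters (reduction ratio, bandwidths) are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CBAM(nn.Module):
    """Convolutional Block Attention Module: channel attention, then spatial attention."""

    def __init__(self, channels: int, reduction: int = 16, spatial_kernel: int = 7):
        super().__init__()
        # Channel attention: shared MLP over average- and max-pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial attention: conv over channel-wise average and max maps.
        self.spatial = nn.Conv2d(2, 1, spatial_kernel, padding=spatial_kernel // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1).view(b, c))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1).view(b, c))
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)           # channel recalibration
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.max(dim=1, keepdim=True).values], dim=1)
        return x * torch.sigmoid(self.spatial(s))                   # spatial recalibration


class CBAMResidualBlock(nn.Module):
    """Basic residual block with CBAM applied before the skip connection."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.cbam = CBAM(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = self.cbam(out)
        return F.relu(out + x)


def mk_mmd(source: torch.Tensor, target: torch.Tensor,
           bandwidths=(1.0, 2.0, 4.0, 8.0)) -> torch.Tensor:
    """Multi-kernel MMD between source- and target-domain feature batches (N x D)."""
    x = torch.cat([source, target], dim=0)
    d2 = torch.cdist(x, x).pow(2)                                   # pairwise squared distances
    k = sum(torch.exp(-d2 / (2 * bw ** 2)) for bw in bandwidths)    # sum of Gaussian kernels
    n = source.size(0)
    k_ss, k_tt, k_st = k[:n, :n], k[n:, n:], k[:n, n:]
    return k_ss.mean() + k_tt.mean() - 2 * k_st.mean()
```

In training, the MK-MMD term would be computed on the pooled features just before the classifier for a source batch and a target batch, and added to the classification loss with a trade-off weight, so that the learned features align the two domains while remaining discriminative.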
