Fault diagnosis in modern industrial and information systems is critical for ensuring equipment reliability and operational safety, but traditional methods struggle to effectively capture the spatiotemporal dependencies and fault-sensitive features in multi-sensor data, and they rarely consider the dynamic relationships among sensors. To address these challenges, this study proposes DyGAT-FTNet, a novel graph neural network model tailored to multi-sensor fault detection. The model dynamically constructs association graphs through a learnable dynamic graph construction mechanism, automatically generating adjacency matrices from time–frequency features derived from the short-time Fourier transform (STFT). In addition, the dynamic graph attention network (DyGAT) enhances the extraction of spatiotemporal dependencies by dynamically assigning node weights, and the time–frequency graph pooling layer further aggregates time–frequency information and optimizes the feature representation. Experimental evaluations on two benchmark multi-sensor fault detection datasets, the XJTU Spurgear dataset and the SEU dataset, show that DyGAT-FTNet significantly outperforms existing methods in classification accuracy, achieving accuracies of 1.0000 and 0.9995, respectively, highlighting its potential for practical applications.
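The abstract does not give implementation details, but the following is a minimal sketch of one plausible way to realize the described dynamic graph construction step: per-sensor STFT magnitude features are embedded and a soft adjacency matrix is produced from their pairwise similarities. All names (`DynamicGraphBuilder`, `n_fft`, `embed_dim`) are hypothetical and not taken from the paper.

```python
# Sketch (assumption): learnable adjacency from per-sensor STFT features,
# loosely following the abstract's description; not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicGraphBuilder(nn.Module):
    """Builds a soft adjacency matrix over sensors from STFT magnitude features."""

    def __init__(self, n_fft: int = 64, embed_dim: int = 32):
        super().__init__()
        self.n_fft = n_fft
        freq_bins = n_fft // 2 + 1
        self.embed = nn.Linear(freq_bins, embed_dim)

    def forward(self, signals: torch.Tensor):
        # signals: (batch, num_sensors, signal_length)
        b, n, length = signals.shape
        # Per-sensor STFT magnitude, averaged over time frames -> (b, n, freq_bins)
        spec = torch.stft(
            signals.reshape(b * n, length),
            n_fft=self.n_fft,
            hop_length=self.n_fft // 2,
            return_complex=True,
        ).abs()
        feats = spec.mean(dim=-1).reshape(b, n, -1)
        # Node embeddings and pairwise similarity -> row-normalized adjacency (b, n, n)
        h = torch.tanh(self.embed(feats))
        adj = F.softmax(h @ h.transpose(1, 2), dim=-1)
        return feats, adj


if __name__ == "__main__":
    x = torch.randn(4, 8, 1024)  # 4 samples, 8 sensors, 1024 time steps
    feats, adj = DynamicGraphBuilder()(x)
    print(feats.shape, adj.shape)  # (4, 8, 33) and (4, 8, 8)
```

The resulting node features and adjacency could then be fed to a graph attention layer (the paper's DyGAT) and a pooling stage; those components are omitted here since the abstract gives no further detail.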