Abstract

Due to complex spatial correlations, dynamic temporal trends, and heterogeneities, accurate remaining useful life (RUL) prediction is a challenging task for multi-sensor complex systems. Existing frameworks usually design complex graph convolutional networks (GCNs) for multi-sensor information fusion, capturing shared patterns with predefined graphs. However, predefined graphs do not necessarily reflect correct and complete correlations among sensors. Furthermore, dynamic temporal trend extraction based on an iterative mechanism suffers from error accumulation from a global perspective and ignores heterogeneous correlations. To overcome these limitations, a novel graph neural network framework, namely, the Spatial–temporal Dual-channel Adaptive Graph Convolutional Network (SDAGCN), is proposed for RUL prediction. It mainly consists of dual channels, i.e., local and global spatial–temporal modules with learnable graphs, which adaptively capture hidden spatial correlations. Benefiting from these two modules, SDAGCN can effectively extract hidden spatial correlations along both the local and global time axes as well as heterogeneities. Finally, the superior performance of our model is verified on two simulated aircraft engine datasets with multiple sensors.
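The abstract does not give implementation details, but the core idea of a learnable (adaptive) graph can be illustrated with a minimal sketch. The layer below is a generic adaptive graph convolution in PyTorch, assuming the adjacency is inferred from learnable sensor embeddings rather than predefined; all names, dimensions, and the embedding-product construction are illustrative assumptions, not the authors' actual SDAGCN design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveGraphConv(nn.Module):
    """Hypothetical adaptive graph convolution: the sensor adjacency matrix is
    learned from node embeddings instead of being predefined."""

    def __init__(self, num_nodes: int, in_dim: int, out_dim: int, emb_dim: int = 16):
        super().__init__()
        # Learnable sensor embeddings that parameterize the graph structure.
        self.emb1 = nn.Parameter(torch.randn(num_nodes, emb_dim))
        self.emb2 = nn.Parameter(torch.randn(num_nodes, emb_dim))
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, in_dim) -- per-sensor features at one time step.
        # Infer a dense adjacency from the embeddings and row-normalize it.
        adj = F.softmax(F.relu(self.emb1 @ self.emb2.T), dim=-1)  # (N, N)
        # Aggregate neighbor features with the learned adjacency, then project.
        return self.proj(torch.einsum("ij,bjf->bif", adj, x))


if __name__ == "__main__":
    layer = AdaptiveGraphConv(num_nodes=14, in_dim=8, out_dim=32)
    x = torch.randn(4, 14, 8)   # 4 samples, 14 sensors, 8 features per sensor
    print(layer(x).shape)       # torch.Size([4, 14, 32])
```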
