Abstract

With the advent of Industry 4.0, multiple sensors are deployed to monitor the degradation of machinery. While machinery is operating, the multi-sensor signals are potentially related to one another. However, existing deep-learning-based prognosis models are often limited by a lack of (1) consideration of the temporal and spatial dependencies in multi-sensor signals collected from dynamic and complex machinery systems; and (2) uncertainty quantification of the remaining useful life (RUL), which plays an essential role in maintenance scheduling and spare-parts management. How to fuse multi-sensor signals and manage uncertainty are therefore two major concerns in deep-learning-based prognosis with multi-sensor signals. To tackle these challenges, a gated graph convolutional network (GGCN) is developed for multi-sensor signal fusion and RUL prediction. First, spatial–temporal graphs are constructed from the multi-sensor signals as the input of the prognosis model. Next, gated graph convolutional layers are built to accurately extract degradation features by simultaneously modeling the temporal and spatial dependencies in the multi-sensor signals. Finally, the extracted features are fed into a quantile regression layer to estimate the RUL and its confidence interval. Experimental results on a simulated graph dataset, a bearing dataset from a real wind farm, a turbofan engine dataset, and a tool wear dataset validate the effectiveness of the proposed GGCN-based prognosis framework.
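A quantile regression layer, as mentioned above, is typically trained with the pinball (quantile) loss; predicting a low and a high quantile of the RUL then yields a confidence interval. The following is a minimal NumPy sketch of that loss, not the paper's implementation; the function name and example values are illustrative only.

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Pinball loss for quantile level q in (0, 1).

    Penalizes under-prediction with weight q and over-prediction
    with weight (1 - q), so minimizing it yields the q-th quantile.
    """
    diff = y_true - y_pred
    return np.mean(np.maximum(q * diff, (q - 1) * diff))

# Illustrative RUL values (in cycles): true RUL 10, predicted 8.
y_true = np.array([10.0])
y_pred = np.array([8.0])

# At q = 0.9 the same under-prediction is penalized more heavily
# than at q = 0.5, pushing the 0.9-quantile prediction upward.
print(pinball_loss(y_true, y_pred, 0.5))  # 1.0
print(pinball_loss(y_true, y_pred, 0.9))  # 1.8
```

Training one output head per quantile (e.g., q = 0.1 and q = 0.9) gives the lower and upper bounds of an 80% confidence interval around the RUL estimate.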
