Abstract
Remaining useful life (RUL) estimation is a fundamental task in the prognostics and health management (PHM) of industrial equipment and systems. In this paper, we propose a novel deep-learning approach to RUL estimation, motivated by the success of deep neural architectures in sequence learning. Specifically, we adopt the Transformer encoder as the backbone of our model to capture both short- and long-term dependencies in a time sequence. Unlike convolutional neural network based methods, the model's receptive field covers all time steps without being limited by the kernel size; unlike recurrent neural network based methods, it is built on dot-product self-attention and can therefore fully exploit parallel computation. Moreover, because the attention mechanism in the Transformer encoder yields high-level features that are insensitive to local context, we further propose a gated convolutional unit that enables the model to incorporate local context at each time step. Experiments on the C-MAPSS datasets show that our model performs comparably to or better than existing methods. Ablation studies further demonstrate the necessity and effectiveness of each component of the proposed model.
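The abstract does not specify the exact form of the gated convolutional unit, so the following is only an illustrative sketch of the general idea it names: a same-padded 1-D convolution over the time axis whose output is modulated by a sigmoid gate (GLU-style), so that each time step's feature absorbs its local context before entering the attention layers. All function and parameter names here are hypothetical, not taken from the paper.

```python
import numpy as np

def gated_conv_unit(x, w_content, w_gate, kernel_size=3):
    """Illustrative GLU-style gated convolutional unit (an assumption,
    not the paper's exact design): a same-padded 1-D convolution over
    time, with the content branch modulated by a sigmoid gate so each
    time step's feature incorporates its local context.

    x:         (T, D) input time sequence
    w_content: (kernel_size, D, D) weights for the content branch
    w_gate:    (kernel_size, D, D) weights for the gate branch
    returns:   (T, D) locally contextualised features
    """
    T, D = x.shape
    pad = kernel_size // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))      # same padding along time
    a = np.zeros((T, D))
    b = np.zeros((T, D))
    for t in range(T):
        window = xp[t:t + kernel_size]        # (kernel_size, D) local window
        a[t] = np.einsum('kd,kde->e', window, w_content)
        b[t] = np.einsum('kd,kde->e', window, w_gate)
    return a * (1.0 / (1.0 + np.exp(-b)))     # content * sigmoid(gate)

rng = np.random.default_rng(0)
T, D, K = 8, 4, 3
x = rng.standard_normal((T, D))
y = gated_conv_unit(x,
                    rng.standard_normal((K, D, D)) * 0.1,
                    rng.standard_normal((K, D, D)) * 0.1,
                    kernel_size=K)
print(y.shape)  # one feature vector per time step: (8, 4)
```

In this sketch the gate bounds each output by the magnitude of its content branch, which is the usual GLU behaviour; in the proposed model the unit's output would feed into the Transformer encoder's self-attention layers.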