Abstract

With the advancement of artificial intelligence and machine learning, deep learning methods, as a class of data-driven methods, have become increasingly popular for Remaining Useful Life (RUL) prediction. Time-series prediction is one main branch of RUL prediction. Gated Recurrent Unit (GRU) networks have proven effective in various sequence tasks, but their accuracy degrades in long-term prediction. The Transformer is considered well suited to capturing long-distance dependencies thanks to its attention mechanism, but it was originally proposed for Natural Language Processing (NLP) problems. In this paper, we propose a remaining useful life prediction method based on an improved Transformer and GRU (Gformer). Specifically, we use the Transformer encoder to extract features from the transformed input data; its multi-head attention helps the model focus on specific parts of the input and capture long-range dependencies in the time sequence. Meanwhile, since the Transformer encoder alone cannot comprehensively analyze the multiple features extracted by the multi-head attention, we combine it with a GRU network to address this problem. Finally, a case study is presented to verify the effectiveness of the proposed method.
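For readers who want a concrete picture of the encoder-plus-GRU pipeline described above, the following is a minimal PyTorch sketch of such an architecture for RUL regression. It is not the paper's implementation: all layer sizes, the two-layer encoder depth, the use of the GRU's final hidden state, and the linear prediction head are illustrative assumptions, since the abstract does not specify hyperparameters or exact wiring.

```python
import torch
import torch.nn as nn

class GformerSketch(nn.Module):
    """Hypothetical Transformer-encoder + GRU model for RUL regression.

    Hyperparameters and the prediction head are assumptions for
    illustration only; they are not taken from the paper.
    """

    def __init__(self, n_features, d_model=64, n_heads=4, gru_hidden=32):
        super().__init__()
        # Project raw sensor features into the model dimension.
        self.input_proj = nn.Linear(n_features, d_model)
        # Transformer encoder: multi-head self-attention captures
        # long-distance dependencies along the time axis.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # GRU aggregates the attention features sequentially, providing
        # the recurrent summary the encoder alone does not produce.
        self.gru = nn.GRU(d_model, gru_hidden, batch_first=True)
        # Regression head mapping the final hidden state to a scalar RUL.
        self.head = nn.Linear(gru_hidden, 1)

    def forward(self, x):
        # x: (batch, seq_len, n_features)
        h = self.input_proj(x)
        h = self.encoder(h)
        _, h_n = self.gru(h)       # h_n: (1, batch, gru_hidden)
        return self.head(h_n[-1])  # (batch, 1) predicted RUL


# Usage on a dummy batch of 14-sensor windows of length 30.
model = GformerSketch(n_features=14)
rul = model(torch.randn(8, 30, 14))
print(rul.shape)  # torch.Size([8, 1])
```

In this sketch the GRU runs over the encoder's output sequence and only its last hidden state feeds the regression head, which is one plausible way to let a recurrent layer fuse the multiple attention-derived features into a single RUL estimate.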
