Abstract

Unlocking maintenance knowledge is challenging because maintenance information is recorded mainly as free text. To extract this knowledge, a decision-making solution is proposed that retrieves similar past cases to help solve new maintenance problems. In this work, an unsupervised domain fine-tuning technique, the Transformer-based Sequential Denoising Auto-Encoder (TSDAE), is used to fine-tune a BERT (Bidirectional Encoder Representations from Transformers) model on a domain-specific corpus of Maintenance Work Orders (MWOs). Unsupervised fine-tuning helped the BERT model adapt to MWO text. Results indicate that the fine-tuned BERT model can generate semantic matches between MWOs despite the complex nature of maintenance text.
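The case-retrieval step described in the abstract can be sketched as cosine-similarity ranking over sentence embeddings. This is a minimal illustration, not the paper's implementation: the toy three-dimensional vectors below are hypothetical placeholders standing in for the embeddings a TSDAE fine-tuned BERT model would produce for each MWO.

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve_similar(query_vec, corpus_vecs, top_k=2):
    """Rank historical MWO embeddings by similarity to a new work order."""
    scored = [(i, cosine_similarity(query_vec, v))
              for i, v in enumerate(corpus_vecs)]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Toy vectors standing in for fine-tuned BERT sentence embeddings.
corpus = [
    [0.9, 0.1, 0.0],   # e.g. "pump seal leaking"
    [0.1, 0.9, 0.2],   # e.g. "replace air filter"
    [0.8, 0.2, 0.1],   # e.g. "seal failure on pump"
]
query = [0.85, 0.15, 0.05]  # new MWO, e.g. "pump leaking at seal"

matches = retrieve_similar(query, corpus)  # highest-scoring past cases first
```

In practice the embeddings would come from a model fine-tuned with TSDAE (available, for example, via the sentence-transformers library's `DenoisingAutoEncoderLoss`), and the retrieved cases' resolutions would inform the new maintenance decision.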
