Abstract
Predictive Business Process Monitoring (PBPM) is one of the essential tasks in Business Process Management (BPM). It aims to predict the future behavior of an ongoing case, such as its next activity or its outcome, using completed cases of the process stored in an event log. Although various deep learning methods have been proposed for PBPM, none of them considers applying a single model to multiple prediction tasks simultaneously. This paper proposes a multi-task prediction method based on BERT and transfer learning. First, the method performs a self-supervised pre-training task, the Masked Activity Model (MAM), on a large number of unlabeled traces using BERT (Bidirectional Encoder Representations from Transformers). MAM captures the bidirectional semantic information of the input traces through the bidirectional Transformer structure in BERT and obtains long-term dependencies between activities through the Transformer's attention mechanism, yielding a universal representation model of the traces. Then, two models are defined for the next-activity and case-outcome prediction tasks, respectively, and the pre-trained model is transferred to both prediction models and trained with a fine-tuning strategy. Experimental evaluation on eleven real-world event logs shows that the performance of the prediction tasks is affected by the masking tactic and masking probability used in MAM. The method performs well on both the next-activity and case-outcome prediction tasks, and it can be adapted to several different prediction tasks faster and with better performance than direct training without pre-training.
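The abstract does not include code; the sketch below is a minimal, illustrative rendering of the two steps it describes, assuming PyTorch, a toy activity vocabulary, and made-up names (`TraceEncoder`, `NextActivityHead`, `mask_activities`) and hyperparameters (embedding size, masking probability). It is not the authors' implementation, only a plausible shape for MAM-style pre-training on activity sequences followed by a fine-tuned next-activity head.

```python
# Sketch of Masked Activity Model (MAM) pre-training and fine-tuning (assumptions, not the paper's code).
import torch
import torch.nn as nn

PAD, MASK = 0, 1            # reserved token ids (assumption)
VOCAB = 20                  # distinct activities + special tokens (assumption)
D_MODEL, MASK_PROB = 64, 0.15

class TraceEncoder(nn.Module):
    """BERT-style bidirectional Transformer encoder over activity sequences."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, D_MODEL, padding_idx=PAD)
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=4, batch_first=True)
        self.enc = nn.TransformerEncoder(layer, num_layers=2)
        self.mlm_head = nn.Linear(D_MODEL, VOCAB)   # predicts masked activities

    def forward(self, ids):
        h = self.enc(self.emb(ids), src_key_padding_mask=(ids == PAD))
        return h, self.mlm_head(h)

def mask_activities(ids):
    """Randomly replace activities with [MASK]; labels are -100 where not masked."""
    labels = ids.clone()
    chosen = (torch.rand_like(ids, dtype=torch.float) < MASK_PROB) & (ids != PAD)
    labels[~chosen] = -100
    masked = ids.clone()
    masked[chosen] = MASK
    return masked, labels

class NextActivityHead(nn.Module):
    """Fine-tuning head: predict the next activity from the pooled trace encoding."""
    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder                     # pre-trained, fine-tuned end to end
        self.cls = nn.Linear(D_MODEL, VOCAB)

    def forward(self, ids):
        h, _ = self.encoder(ids)
        h = h.masked_fill((ids == PAD).unsqueeze(-1), 0.0)
        pooled = h.sum(1) / (ids != PAD).sum(1, keepdim=True)   # mean over real positions
        return self.cls(pooled)

# One MAM pre-training step on a toy batch of padded traces (activity ids >= 2).
model = TraceEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
batch = torch.tensor([[3, 4, 5, 6, 0, 0], [3, 7, 8, 9, 10, 6]])
inp, labels = mask_activities(batch)
_, logits = model(inp)
loss = nn.CrossEntropyLoss(ignore_index=-100)(logits.view(-1, VOCAB), labels.view(-1))
opt.zero_grad(); loss.backward(); opt.step()

# Fine-tuning then reuses the same encoder weights inside a task-specific model,
# e.g. NextActivityHead(model); a case-outcome head would differ only in output size.
```

The case-outcome task in the paper would follow the same pattern with a binary (or multi-class) classification layer in place of the activity-sized output; the masking tactic and probability used in `mask_activities` correspond to the factors the abstract reports as influencing downstream performance.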