Abstract
Bayesian neural networks (BNNs) combine Bayesian theory with deep learning, providing a probabilistic interpretation of deep learning models. Traditional BNNs typically assume that network parameters follow a standard Gaussian distribution, which may not accurately reflect the true parameter distribution. To address this issue, this paper proposes a Recurrent Bayesian Neural Network (RBNN) that employs two trainable Gaussian distributions as priors. Furthermore, an alternating-cycle parameter update algorithm is designed accordingly, which not only prevents overfitting to specific data during training but also enhances the robustness of model parameters in error approximation. Additionally, the RBNN module can be conveniently integrated with classical neural network architectures. By integrating the RBNN with a temporal convolutional network (TCN), the effectiveness of the proposed approach was validated on a dataset for remaining useful life (RUL) prediction of gear and bearing components in civil aircraft. Comparative studies show that the integrated model outperforms traditional methods in RUL prediction and uncertainty quantification, offering superior accuracy and robustness: relative to the BNN, the RMSE of the RBNN predictions is reduced by 17.7% and the score is increased by 27.5%.
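The abstract contrasts a fixed standard-normal prior with trainable Gaussian priors; the paper's actual RBNN formulation is not given here, so the following is only a minimal NumPy sketch of that general idea for a single Bayesian weight vector. All names (`mu`, `rho`, `prior_mu`, `prior_rho`) and the softplus parameterization are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Variational posterior q(w) = N(mu, sigma^2), and a *trainable* Gaussian
# prior p(w) = N(prior_mu, prior_sigma^2). In a traditional BNN the prior
# is fixed to N(0, 1); here its parameters would be learned alongside mu, rho.
mu, rho = rng.normal(size=4), np.full(4, -3.0)      # posterior parameters
prior_mu, prior_rho = np.zeros(4), np.zeros(4)      # trainable prior parameters

def softplus(x):
    """Map an unconstrained parameter to a positive std: sigma = log(1 + e^x)."""
    return np.log1p(np.exp(x))

def sample_weights():
    """Reparameterized sample: w = mu + sigma * eps, with eps ~ N(0, 1)."""
    return mu + softplus(rho) * rng.normal(size=mu.shape)

def kl_gauss(q_mu, q_sigma, p_mu, p_sigma):
    """KL(N(q_mu, q_sigma^2) || N(p_mu, p_sigma^2)), summed over weights."""
    return np.sum(np.log(p_sigma / q_sigma)
                  + (q_sigma**2 + (q_mu - p_mu)**2) / (2 * p_sigma**2) - 0.5)

w = sample_weights()                                    # one stochastic forward pass
kl = kl_gauss(mu, softplus(rho), prior_mu, softplus(prior_rho))
```

In variational training, `kl` would enter the loss alongside the data-fit term, and gradients would flow into the prior parameters as well, which is what distinguishes the trainable-prior setup from the fixed N(0, 1) assumption the abstract criticizes.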