Uncertainty estimation is crucial for developing reliable machine learning models. The natural posterior network (NatPN) provides uncertainty estimation for any single exponential-family distribution, but real-world data are often too complex for a single distribution to capture. We therefore introduce the mixture exponential family posterior network (MEFDPN), which extends the prior to a mixture of exponential-family distributions, aiming to fit complex distributions that better represent real data. During training, MEFDPN independently updates the Bayesian posterior estimate of each component distribution, while the mixture weights are updated from the forward-propagation results. Furthermore, MEFDPN computes two types of uncertainty (aleatoric and epistemic) and combines them by entropy weighting into a comprehensive confidence score for each data point. Theoretically, MEFDPN achieves higher prediction accuracy, and experimental results demonstrate that it produces high-quality comprehensive confidence scores, with encouraging accuracy in out-of-distribution (OOD) detection and validation experiments. Finally, we apply MEFDPN to a materials dataset, where it efficiently filters out OOD samples and significantly improves the prediction accuracy of downstream machine learning models: removing only 5% of the data as outliers yields a 2%–5% improvement in accuracy.
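The abstract does not spell out how the entropy weighting combines the two uncertainty types. Below is a minimal sketch of one plausible reading, the classical entropy-weight method applied to per-sample aleatoric and epistemic uncertainties; the function name, the mapping from total uncertainty to confidence, and the 5% cutoff helper are all illustrative assumptions, not the paper's verified implementation.

```python
import numpy as np

def entropy_weighted_confidence(aleatoric, epistemic, eps=1e-12):
    """Combine per-sample aleatoric and epistemic uncertainties into one
    confidence score using the classical entropy-weight method (assumed
    reading of the paper's "entropy weighting"; details may differ)."""
    X = np.stack([aleatoric, epistemic], axis=1)        # (n, 2) indicator matrix
    P = X / (X.sum(axis=0, keepdims=True) + eps)        # column-normalize to proportions
    n = X.shape[0]
    e = -(P * np.log(P + eps)).sum(axis=0) / np.log(n)  # per-indicator entropy in [0, 1]
    w = (1.0 - e) / ((1.0 - e).sum() + eps)             # more informative indicator -> larger weight
    u = X @ w                                           # entropy-weighted total uncertainty
    return 1.0 / (1.0 + u)                              # map to a confidence score in (0, 1]

def filter_low_confidence(confidence, frac=0.05):
    """Hypothetical outlier filter matching the abstract's experiment:
    keep indices of all samples except the lowest-confidence `frac`."""
    cutoff = np.quantile(confidence, frac)
    return np.where(confidence > cutoff)[0]
```

Under this reading, the indicator whose values vary more informatively across the dataset receives the larger weight, and the materials-dataset experiment corresponds to dropping the bottom 5% of samples by the resulting confidence before retraining the predictor.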