The combination of computer vision with deep learning has become a popular tool for automating labor-intensive monitoring tasks in modern livestock farming. However, the uncontrolled and varying environmental conditions that typically prevail in farm housing affect the performance of vision-based applications. Image quality can be degraded, for instance by occlusions, changing illumination, or animal motion, which can reduce the reliability of such applications. To address this issue, this study proposes an approach for identifying uncertain neural-network predictions in order to improve overall prediction quality. It combines the direct quantification of aleatoric and epistemic uncertainty with the indirect estimation of uncertainty through the prediction of occlusions. Our approach integrates these methods into a single end-to-end trainable instance segmentation and regression model. The objective of this study was first to investigate how well the different measures quantify the uncertainty of a prediction by comparing them to human uncertainty assessments. We then analyzed whether the uncertainty estimates can identify and reject erroneous predictions by evaluating the correlation between the predictive error and the estimated uncertainty. Finally, individual predictions were rejected based on the estimated uncertainties to analyze the effect on overall accuracy. As a use case, the developed methods were applied to the prediction of plumage conditions of chickens and additionally examined in a separate domain. The results show that the outputs of our approaches for estimating aleatoric and epistemic uncertainty correlate with the predictive error of the model and lead to increased performance when uncertain predictions are rejected. In contrast, the indirect method of identifying occluded samples did not serve as a reliable indicator of uncertainty and could therefore not be used to improve the accuracy of the model outputs.
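The abstract does not detail the authors' architecture, so the following is only a minimal, hypothetical sketch of the general pattern it describes: estimating aleatoric uncertainty via a predicted-variance head, estimating epistemic uncertainty via Monte Carlo dropout, and rejecting predictions whose combined uncertainty exceeds a threshold. All names (`UncertainRegressor`, `n_mc_samples`, `reject_threshold`) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only; not the model described in the paper.
import torch
import torch.nn as nn


class UncertainRegressor(nn.Module):
    """Toy regression head that outputs a mean and a log-variance (aleatoric term)."""

    def __init__(self, in_features: int = 16):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(in_features, 32),
            nn.ReLU(),
            nn.Dropout(p=0.2),  # kept active at inference for MC dropout
        )
        self.mean_head = nn.Linear(32, 1)
        self.logvar_head = nn.Linear(32, 1)

    def forward(self, x):
        h = self.backbone(x)
        return self.mean_head(h), self.logvar_head(h)


def predict_with_uncertainty(model, x, n_mc_samples: int = 20):
    """Run several stochastic forward passes (dropout on) and aggregate."""
    model.train()  # keeps dropout active; assumes no batch-norm layers are present
    means, alea_vars = [], []
    with torch.no_grad():
        for _ in range(n_mc_samples):
            mu, logvar = model(x)
            means.append(mu)
            alea_vars.append(logvar.exp())
    means = torch.stack(means)                      # shape (T, N, 1)
    aleatoric = torch.stack(alea_vars).mean(dim=0)  # mean predicted variance
    epistemic = means.var(dim=0)                    # variance across MC samples
    return means.mean(dim=0), aleatoric, epistemic


if __name__ == "__main__":
    model = UncertainRegressor()
    x = torch.randn(8, 16)
    pred, alea, epis = predict_with_uncertainty(model, x)
    reject_threshold = 0.5  # illustrative value; in practice tuned on validation data
    keep = (alea + epis).squeeze(-1) < reject_threshold
    print(f"kept {int(keep.sum())} of {len(keep)} predictions")
```

In this kind of setup, the rejection threshold trades coverage against accuracy: a stricter threshold discards more predictions but tends to raise the accuracy of those that remain, which matches the rejection analysis the abstract describes.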