Abstract
We propose optimized deep learning (DL) models for the automatic analysis of udder conformation traits in cattle. One such trait is supernumerary teats, i.e., teats in excess of the normal number; supernumerary teats are the most common congenital heritable condition in cattle. A major advantage of our proposed method is its capability to automatically select the relevant images and then perform supernumerary teat classification even when limited data are available. To this end, we perform an experimental analysis on an image dataset that we collected using a handheld device combining a depth camera and an RGB camera. To reveal the underlying characteristics of our data, we apply the uniform manifold approximation and projection (UMAP) technique. Furthermore, for a comprehensive evaluation, we explore the impact of different data augmentation techniques on the performance of the DL models, and we compare models trained on RGB data alone against models trained on combined RGB and depth data; for the latter, we integrate the three RGB channels with the depth channel to obtain four-channel inputs. We report the results of all models in terms of four performance metrics, namely accuracy, F-score, precision, and sensitivity. The experimental results reveal that stronger data augmentation improves the performance of the DL models by approximately 10%. Our proposed method also outperforms the reference methods recently introduced in the literature.
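The four-channel fusion described above can be sketched as follows. This is an illustrative example, not the authors' code: the function name, shapes, and dtype are assumptions, and it simply stacks a depth map onto an RGB image along the channel axis.

```python
import numpy as np

def fuse_rgbd(rgb: np.ndarray, depth: np.ndarray) -> np.ndarray:
    """Stack an RGB image (H, W, 3) and a depth map (H, W) into an
    (H, W, 4) RGB-D array suitable as a four-channel model input.
    Hypothetical helper; shapes/dtype are assumptions, not the paper's code."""
    if rgb.shape[:2] != depth.shape:
        raise ValueError("RGB and depth must share the same spatial dimensions")
    # Append the depth map as a fourth channel after the three RGB channels.
    return np.concatenate([rgb, depth[..., np.newaxis]], axis=-1)

rgb = np.zeros((480, 640, 3), dtype=np.float32)   # placeholder RGB frame
depth = np.ones((480, 640), dtype=np.float32)     # placeholder depth frame
rgbd = fuse_rgbd(rgb, depth)
print(rgbd.shape)  # (480, 640, 4)
```

In practice the depth channel would typically be normalized to a range comparable to the RGB channels before being fed to a network.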