Abstract
Due to the variable size of sheep carcasses and the complex surface tissue of the hind legs, recognition accuracy for the target muscle regions to be segmented is low. This paper proposes a method for detecting segmentation features of sheep carcass hind legs and validates it with a segmentation test. The approach takes a multi-scale dual attention U-Net (MDAU-Net) semantic segmentation network as its core, effectively combining features from different layers with spatial and channel attention modules. We design a multi-scale dual attention (MDA) module to enhance multi-scale contextual semantics and local detail features, and embed it in the U-Net skip connections to extract the specific semantic and local detail features of the encoding stage. Experimental results show that the precision (Pre) and mean intersection over union (MIoU) of the MDAU-Net network on a self-built dataset of sheep carcass hind-leg regions are 93.76% and 86.94%, respectively, both better than the control group, demonstrating the segmentation accuracy of this method on the sheep carcass hind-leg dataset. Actual segmentation was then performed based on the feature recognition results: the average offset distance of the tool's target cutting point was 4.02 mm, and the average segmentation residual rate was 6.28%, which essentially meets the requirements for primary segmentation of sheep carcass hind legs. This study provides a technical reference for autonomous cutting technology for livestock meat.
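The dual-attention idea underlying the MDA module can be illustrated with a minimal NumPy sketch. This is a simplified, hypothetical illustration only: the paper's MDA module uses learned convolutional weights and multi-scale branches, which are omitted here. The sketch shows the two complementary gating mechanisms — a channel gate derived from global average pooling and a spatial gate derived from cross-channel pooling — applied to a feature map of shape (channels, height, width):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x):
    """Gate each channel by a scalar computed from global average pooling.

    x: feature map of shape (C, H, W). A learned projection (e.g. a small
    MLP, as in squeeze-and-excitation blocks) is omitted for simplicity.
    """
    gap = x.mean(axis=(1, 2))            # (C,) per-channel descriptor
    gate = sigmoid(gap)                  # (C,) gate in (0, 1)
    return x * gate[:, None, None]

def spatial_attention(x):
    """Gate each spatial location by pooling across channels.

    x: feature map of shape (C, H, W). A learned convolution over the
    pooled maps (as in CBAM-style modules) is omitted for simplicity.
    """
    avg_pool = x.mean(axis=0)            # (H, W)
    max_pool = x.max(axis=0)             # (H, W)
    gate = sigmoid(avg_pool + max_pool)  # (H, W) gate in (0, 1)
    return x * gate[None, :, :]

def dual_attention(x):
    """Fuse the two attention branches by summation.

    Stands in for embedding an attention module on a U-Net skip
    connection; the actual MDA fusion strategy may differ.
    """
    return channel_attention(x) + spatial_attention(x)
```

In a full network, such a module would sit on each skip connection, reweighting encoder features before they are concatenated with the decoder features at the matching resolution.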