Abstract
The semantic segmentation of annual rings is a research topic of interest in dendrochronology. To address the difficulty of segmenting annual rings in dense regions, where results are strongly affected by defects such as cracks and wormholes, this paper proposes DAF-Net++, a model based on U-Net with a VGG16 backbone that incorporates dense skip connections, CBAM, and DCAM. In this model, VGG16 strengthens image-feature extraction, dense skip connections fuse semantic information across different levels, DCAM provides weighting guidance for shallow features, and CBAM compensates for information lost during down-sampling. Taking Chinese fir as the experimental subject, 1700 CT images of wood cross-sections were acquired with medical CT equipment, and 120 of them were randomly selected as the dataset, which was then expanded by cropping, rotation, and other augmentations. DAF-Net++ was trained to segment the annual rings, and the performance of the model was then evaluated. Training used freeze training followed by thaw training, with Focal Loss as the loss function, ReLU as the activation function, and Adam as the optimizer. The experimental results show that, in the segmentation of CT images of Chinese fir annual rings, DAF-Net++ achieves an MIoU of 93.67%, an MPA of 96.76%, a PA of 96.63%, and a Recall of 96.76%. Compared with other semantic segmentation models such as U-Net, U-Net++, and DeepLabv3+, DAF-Net++ delivers better segmentation performance.
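The abstract names Focal Loss as the training criterion, which down-weights easy pixels so the model focuses on hard regions such as densely packed rings. A minimal NumPy sketch of the binary (ring vs. background) form is below; the hyperparameters `alpha` and `gamma` are the conventional defaults from the focal-loss literature, not values reported in this paper:

```python
import numpy as np

def focal_loss(probs, targets, alpha=0.25, gamma=2.0, eps=1e-7):
    """Binary focal loss for per-pixel ring/background labels.

    probs:   predicted foreground probabilities in (0, 1)
    targets: ground-truth labels in {0, 1}
    alpha, gamma: conventional defaults, assumed rather than taken from the paper
    """
    probs = np.clip(probs, eps, 1.0 - eps)
    # p_t is the probability the model assigns to the true class of each pixel
    p_t = np.where(targets == 1, probs, 1.0 - probs)
    alpha_t = np.where(targets == 1, alpha, 1.0 - alpha)
    # the (1 - p_t)^gamma factor shrinks the loss of easy, well-classified pixels
    return float(np.mean(-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)))
```

With `gamma = 0` and `alpha = 0.5` this reduces (up to a constant factor) to ordinary cross-entropy; increasing `gamma` shifts the gradient budget toward misclassified pixels.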