Abstract
Accurate lesion segmentation plays a crucial role in the clinical diagnosis of honeycomb lung. However, the diversity of honeycomb lung shapes, distributions, and textures makes segmentation a challenging task. Both pure Convolutional Neural Network (CNN) and pure Transformer methods have nonnegligible drawbacks for lesion segmentation. The convolution operation in a CNN cannot capture comprehensive global information, which seriously degrades segmentation accuracy. The Transformer extracts global features effectively but is highly sensitive to positional information in the input sequence and struggles to capture local features, making it prone to overfitting and to producing inaccurate segmentation results. To address these problems, we propose a segmentation model that optimizes the skip connections of a U-shaped convolutional network with a channel Transformer, and uses the Gaussian context transformer and ConvMixer to strengthen the network's encoder-decoder structure. In addition, the simple skip connections are replaced by a Pyramidal Pooling Channel Transformer. This module efficiently exploits the semantic information in multi-scale channel features, providing sufficient information for decoding. Compared with other classical deep learning methods, our model achieves better results for the segmentation of honeycomb lung lesions: IoU of 89.12%, Dice coefficient of 93.82%, mIoU of 94.45%, and precision of 91.31%.
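The Gaussian context transformer mentioned above is a channel-attention mechanism that reweights feature channels with a Gaussian excitation over normalized global context. The sketch below is a minimal NumPy illustration of that idea, not the paper's implementation; the function name, the single-array layout `(C, H, W)`, and the width parameter `c` are assumptions for illustration only.

```python
import numpy as np

def gaussian_context_attention(x, c=2.0):
    """Channel reweighting in the spirit of a Gaussian context transformer.

    x : np.ndarray of shape (C, H, W), one feature map.
    c : assumed Gaussian width hyperparameter.
    """
    # 1) Global average pooling -> one descriptor per channel.
    z = x.mean(axis=(1, 2))                       # shape (C,)
    # 2) Zero-mean, unit-variance normalization across channels.
    z_hat = (z - z.mean()) / (z.std() + 1e-5)     # shape (C,)
    # 3) Gaussian excitation: channels whose response deviates from the
    #    mean context receive smaller weights in (0, 1].
    g = np.exp(-(z_hat ** 2) / (2.0 * c ** 2))    # shape (C,)
    # 4) Broadcast the channel weights over the spatial dimensions.
    return x * g[:, None, None]
```

In a U-shaped network, such a block would sit inside the encoder or decoder stages, letting the model suppress uninformative channels before features are passed across skip connections.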