Abstract

The Augmented Transformer U-Net (AugTransU-Net) is proposed to address limitations in existing transformer-based U-Net models for brain tumor segmentation. While previous models effectively capture long-range dependencies and global context, they partly neglect the feature hierarchy, and their feature diversity diminishes as network depth increases. The proposed AugTransU-Net integrates two advanced transformer modules at different positions within a U-shaped architecture to overcome these issues. The fundamental innovation lies in constructing improved augmentation transformer modules that incorporate Augmented Shortcuts into standard transformer blocks. These augmented modules are placed at the bottleneck of the segmentation network, combining multi-head self-attention blocks with circulant projections to preserve feature diversity and enhance feature interaction. Furthermore, paired attention modules operate from low to high layers throughout the network, establishing long-range relationships in both the spatial and channel dimensions. This allows each layer to comprehend the overall brain tumor structure and capture semantic information at critical locations. Experimental results demonstrate the effectiveness and competitiveness of AugTransU-Net compared to representative works. The model achieves Dice scores of 89.7%/89.8%, 78.2%/78.6%, and 80.4%/81.9% for whole tumor (WT), enhancing tumor (ET), and tumor core (TC) segmentation on the BraTS2019 and BraTS2020 validation datasets, respectively. The code for AugTransU-Net will be made publicly available at https://github.com/MuqinZ/AugTransUnet.
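The core mechanism described above, a transformer block whose output is augmented by parallel shortcut paths built from circulant projections, can be sketched as follows. This is a minimal, single-head NumPy illustration under assumptions not stated in the abstract (the actual model presumably uses learned multi-head attention in a deep-learning framework, and the shortcut kernels would be trainable parameters); a circulant matrix-vector product is computed efficiently via the FFT.

```python
import numpy as np

def circulant_project(x, c):
    # Multiply each token's feature vector by the circulant matrix whose
    # first column is kernel c, using the identity C @ v = IFFT(FFT(c) * FFT(v)).
    return np.fft.ifft(np.fft.fft(x, axis=-1) * np.fft.fft(c), axis=-1).real

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_augmented_shortcuts(x, Wq, Wk, Wv, kernels):
    """Single-head self-attention plus augmented shortcut paths.

    x       : (tokens, dim) input features
    Wq/Wk/Wv: (dim, dim) projection matrices (would be learned in practice)
    kernels : list of (dim,) circulant kernels, one per shortcut path
    """
    # Standard scaled dot-product self-attention.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(q @ k.T / np.sqrt(x.shape[-1]))
    out = attn @ v
    # Augmented shortcuts: parameterized paths that bypass attention,
    # helping preserve feature diversity as depth increases.
    for c in kernels:
        out = out + circulant_project(x, c)
    return out
```

Note that a kernel equal to the first standard basis vector makes the circulant projection an identity map, recovering the ordinary residual shortcut; learned kernels generalize it while keeping the cost low through the FFT.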
