Abstract
Breast ultrasound (BUS) image segmentation is a critical step in the diagnosis and quantitative analysis of breast cancer. Most existing methods for BUS image segmentation do not effectively exploit the prior information that can be extracted from the images. In addition, breast tumors have blurred boundaries, varied sizes, and irregular shapes, and the images are noisy, so tumor segmentation remains challenging. In this paper, we propose a boundary-guided and region-aware network with a global scale-adaptive module (BGRA-GSA) for BUS image segmentation. Specifically, we first design a global scale-adaptive module (GSAM) to extract features of tumors of different sizes from multiple perspectives. GSAM encodes the features at the top of the network along both the channel and spatial dimensions, which effectively captures multi-scale context and provides global prior information. Moreover, we develop a boundary-guided module (BGM) to fully mine boundary information: BGM guides the decoder to learn boundary context by explicitly enhancing the extracted boundary features. In addition, we design a region-aware module (RAM) that cross-fuses diverse breast tumor features across layers, helping the network learn contextual features of tumor regions more effectively. Together, these modules enable BGRA-GSA to capture and integrate rich global multi-scale context, multi-level fine-grained details, and semantic information for accurate breast tumor segmentation. Finally, experimental results on three publicly available datasets show that our model segments breast tumors effectively even when they have blurred boundaries, varied sizes and shapes, and low contrast.
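To make the idea of encoding the deepest encoder features along both channel and spatial dimensions concrete, the following is a minimal, illustrative PyTorch sketch of a generic channel-plus-spatial attention block. The layer sizes, reduction ratio, and attention design are assumptions chosen for clarity; this is not the paper's actual GSAM implementation.

```python
# Illustrative sketch only: a generic channel + spatial attention block applied
# to the deepest encoder features, in the spirit of the GSAM description above.
# The exact attention design and hyperparameters are assumptions, not the
# authors' implementation.
import torch
import torch.nn as nn


class ChannelSpatialAttention(nn.Module):
    """Re-weights deep features along channel and spatial dimensions."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel attention: pool over space, then excite channels (SE-style).
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: collapse channels into a per-pixel gate.
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_gate(x)  # emphasize informative channels
        x = x * self.spatial_gate(x)  # emphasize informative locations
        return x


if __name__ == "__main__":
    # Hypothetical shape for the deepest encoder features.
    feats = torch.randn(2, 256, 16, 16)
    out = ChannelSpatialAttention(256)(feats)
    print(out.shape)  # torch.Size([2, 256, 16, 16])
```

In a scale-adaptive design of this kind, such a block would typically sit at the top of the encoder so that the re-weighted global context can then be propagated to the decoder stages.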