Abstract

Automated breast ultrasound image segmentation is essential in a computer-aided diagnosis (CAD) system for breast tumors. In this article, we present a feature pyramid nonlocal network (FPNN) with transform modal ensemble learning (TMEL) for accurate breast tumor segmentation in ultrasound images. Specifically, the FPNN fuses multilevel features while capturing long-range dependencies by combining the nonlocal module with the feature pyramid network. Additionally, the TMEL is introduced to guide two iFPNNs to extract different tumor details. Two publicly available datasets, i.e., the Dataset-Cairo University and Dataset-Merge, were used for evaluation. The proposed FPNN-TMEL achieves a Dice score of 84.70% ± 0.53%, Jaccard index (Jac) of 78.10% ± 0.48%, and Hausdorff distance (HD) of 2.815 ± 0.016 mm on the Dataset-Cairo University, and a Dice score of 87.00% ± 0.41%, Jac of 79.16% ± 0.56%, and HD of 2.781 ± 0.035 mm on the Dataset-Merge. Qualitative and quantitative experiments show that our method outperforms other state-of-the-art methods for breast tumor segmentation in ultrasound images. Our code is available at https://github.com/pixixiaonaogou/FPNN-TMEL.
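To make the two building blocks named above concrete, the following is a minimal NumPy sketch, not the authors' implementation: a nonlocal block computes self-attention over all spatial positions (the "long-range dependencies"), and an FPN-style top-down step upsamples a coarse pyramid level and adds it to a finer one. All shapes, function names, and the nearest-neighbour upsampling choice are illustrative assumptions; the released code at the GitHub link above is the authoritative version.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def nonlocal_block(feat):
    """Self-attention over all spatial positions of a (C, H, W) feature map.

    Every output position aggregates features from every other position,
    weighted by pairwise affinity, then is added back residually - the core
    idea of a nonlocal module (embedding projections omitted for brevity).
    """
    C, H, W = feat.shape
    x = feat.reshape(C, H * W).T            # (HW, C): one row per position
    attn = softmax(x @ x.T / np.sqrt(C))    # (HW, HW) pairwise affinities
    y = attn @ x                            # aggregate over all positions
    return feat + y.T.reshape(C, H, W)      # residual connection

def fpn_fuse(fine, coarse):
    """FPN-style top-down fusion: upsample the coarse level (here by
    nearest-neighbour repetition) and add it element-wise to the fine level."""
    scale = fine.shape[1] // coarse.shape[1]
    up = coarse.repeat(scale, axis=1).repeat(scale, axis=2)
    return fine + up

# Illustrative use: fuse two pyramid levels, then apply the nonlocal block.
fine = np.random.default_rng(0).standard_normal((8, 16, 16))
coarse = np.random.default_rng(1).standard_normal((8, 8, 8))
fused = nonlocal_block(fpn_fuse(fine, coarse))
```

In the FPNN this kind of fusion would be applied across the pyramid so that segmentation decisions at tumor boundaries can draw on context from the whole image, not just a local receptive field.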
