Abstract

Breast tumor segmentation in ultrasound images is fundamental for quantitative analysis and plays a crucial role in the diagnosis and treatment of breast cancer. Existing methods have mainly focused on spatial-domain implementations, with little attention paid to the frequency domain. In this paper, we propose a Multi-frequency and Multi-scale Interactive CNN-Transformer Hybrid Network (MFMSNet). Specifically, we use Octave convolutions instead of conventional convolutions to effectively separate high-frequency and low-frequency components while reducing computational complexity. We introduce a Multi-frequency Transformer block (MF-Trans) that enables efficient interaction between high-frequency and low-frequency information, thereby capturing long-range dependencies. Additionally, we incorporate a Multi-scale Interactive Fusion module (MSIF) to merge high-frequency feature maps of different sizes, enhancing the emphasis on tumor edges by integrating local contextual information. Experimental results demonstrate the superiority of MFMSNet over seven state-of-the-art methods on two publicly available breast ultrasound datasets and one thyroid ultrasound dataset. MFMSNet was evaluated on the BUSI, BUI, and DDTI datasets, whose test sets comprise 130, 47, and 128 images, respectively. Using five-fold cross-validation, the Dice coefficients obtained are 83.42% (BUSI), 90.79% (BUI), and 79.96% (DDTI). The code is available at https://github.com/wrc990616/MFMSNet.
