Abstract

Breast cancer constitutes a prevalent and escalating health concern worldwide, and the significance of early diagnosis for effective treatment cannot be overstated. Ultrasound imaging (UI) is a cost-effective means of early diagnosis; however, interpreting ultrasound images can be challenging, which has motivated the development of computer-aided diagnostic systems. In this article, a network architecture named the Swin Transformer-based Fork Network (SW-ForkNet) is developed for breast tumor classification. Built on a DenseNet121 backbone, SW-ForkNet combines spatial, semantic, and long-context features. A spatial Squeeze-and-Excitation (sSE) block captures spatial details, while a Swin Transformer branch captures global long-context features. To optimize the acquisition of these features, a connection is established at the middle layer of the DenseNet121 architecture that feeds both the sSE and Swin Transformer branches. At the network's output, the three feature groups are vectorized, combined, and processed to obtain the final feature map, to which a softmax classifier is applied to generate the prediction. Experimental evaluations on three datasets (BUSI, GDPH, and SYSUCC) demonstrate the superior performance of SW-ForkNet, with high accuracy (Acc) and F1-scores (F1S) compared to existing methods and state-of-the-art models: 93.12% Acc and 92.27% F1S on BUSI, 96.15% Acc and 96.04% F1S on GDPH, and 94.88% Acc and 94.03% F1S on SYSUCC. Consequently, the proposed SW-ForkNet model presents a novel and effective structure for breast tumor classification.
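The feature-fusion step described above (three feature groups vectorized, combined, and passed to a softmax classifier) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the branch outputs are random stand-ins for what the DenseNet121 backbone (semantic), sSE block (spatial), and Swin Transformer (long-context) would produce, and all dimensions and the linear head are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Stand-in branch outputs (illustrative dimensions, not from the paper).
semantic = rng.standard_normal(1024)  # DenseNet121 backbone features
spatial  = rng.standard_normal(512)   # sSE-branch features
context  = rng.standard_normal(768)   # Swin Transformer features

# Vectorize and combine the three feature groups into one final feature map.
fused = np.concatenate([semantic, spatial, context])

# Hypothetical linear head for a two-class problem (e.g. benign vs. malignant),
# followed by the softmax classifier that produces the prediction.
W = rng.standard_normal((2, fused.size)) * 0.01
b = np.zeros(2)
probs = softmax(W @ fused + b)
prediction = int(np.argmax(probs))
```

The key design point mirrored here is late fusion: each branch contributes its own feature vector, and only the concatenated map is classified, so spatial, semantic, and long-context information all reach the softmax layer.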
