Abstract

Benefiting from advances in medical imaging, automatic breast image classification has recently been studied extensively across a variety of breast cancer diagnosis tasks. Multi-modality image fusion can further improve classification performance. However, existing multi-modality fusion methods focus on combining modalities while ignoring the interactions between them, which leads to suboptimal performance. To address this issue, we propose a novel attention-based interaction network for breast tumor classification using diffusion-weighted imaging (DWI) and apparent diffusion coefficient (ADC) images. Specifically, we propose a multi-modality interaction mechanism, comprising relational interaction, channel interaction, and discriminative interaction, and use it to design an attention-based interaction module that strengthens inter-modal interactions. Extensive ablation studies were carried out, confirming the contribution of each component. The area under the receiver operating characteristic curve (AUC), accuracy (ACC), specificity (SPC), and sensitivity (SEN) were 87.0%, 87.0%, 88.0%, and 86.0%, respectively, further verifying the method's effectiveness.
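The channel-interaction idea described above can be illustrated with a minimal sketch. The abstract does not specify the module's internals, so everything below (the function name `channel_interaction`, cross-modal sigmoid gating, fusion by summation) is a hypothetical reading of cross-modal channel attention, not the authors' implementation:

```python
import numpy as np

def channel_interaction(feat_a, feat_b):
    """Hypothetical channel-interaction step: each modality's per-channel
    descriptor gates the channels of the *other* modality."""
    # feat_*: (C, H, W) feature maps, e.g. from DWI and ADC branches
    # global average pooling -> per-channel descriptors of shape (C,)
    desc_a = feat_a.mean(axis=(1, 2))
    desc_b = feat_b.mean(axis=(1, 2))
    # sigmoid gates derived from the opposite modality (cross-modal attention)
    gate_a = 1.0 / (1.0 + np.exp(-desc_b))
    gate_b = 1.0 / (1.0 + np.exp(-desc_a))
    # reweight each modality's channels, then fuse by summation
    out_a = feat_a * gate_a[:, None, None]
    out_b = feat_b * gate_b[:, None, None]
    return out_a + out_b

# toy features standing in for DWI and ADC encoder outputs
dwi = np.random.rand(8, 16, 16)
adc = np.random.rand(8, 16, 16)
fused = channel_interaction(dwi, adc)
print(fused.shape)  # (8, 16, 16)
```

In a real network the gates would typically come from small learned layers (e.g. a squeeze-and-excitation style bottleneck) rather than a bare sigmoid over pooled descriptors; the sketch only shows how one modality can modulate the other's channels before fusion.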
