Abstract

Breast cancer can be diagnosed using medical imaging, and the classification performance of imaging-based diagnosis can be improved by multi-modality image fusion. However, existing fusion algorithms fail to consider the importance of modality interactions and cannot fully utilize multi-modality information. Attention mechanisms can effectively explore and combine multi-modality information. We therefore propose a novel triple-attention interaction network for breast tumor classification based on diffusion-weighted imaging (DWI) and apparent diffusion coefficient (ADC) images. A triple inter-modality interaction mechanism is proposed to fully fuse the multi-modality information: three modality interactions are performed through the developed inter-modality relation module, channel interaction module, and multi-level attention fusion module to explore correlative, complementary, and discriminative information, respectively. Additionally, we introduce a novel dual parallel-attention module that combines spatial and channel attention to improve the discriminative ability of single-modality features. With these mechanisms, the proposed algorithm can fully mine and exploit useful multi-modality information to improve classification performance. Experimental results demonstrate that our algorithm outperforms other multi-modality fusion algorithms, and extensive ablation studies verify its advantages. The area under the receiver operating characteristic curve, accuracy, specificity, and sensitivity were 90.5%, 89.0%, 85.6%, and 92.4%, respectively.
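To make the dual parallel-attention idea concrete, below is a minimal PyTorch sketch of a module that applies channel attention and spatial attention to the same feature map in parallel and fuses the two attended outputs. This is an illustrative reconstruction, not the authors' published implementation: the class name DualParallelAttention, the reduction ratio, the pooling choices, and the element-wise-sum fusion are all assumptions.

import torch
import torch.nn as nn

class DualParallelAttention(nn.Module):
    # Hypothetical sketch of parallel channel + spatial attention;
    # details (reduction ratio, pooling, sum fusion) are assumptions.
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel branch: squeeze spatial dims, excite per channel.
        self.channel_mlp = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial branch: squeeze channels, weight each location.
        self.spatial_conv = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Channel weights of shape (B, C, 1, 1), broadcast over H x W.
        ca = self.channel_mlp(x) * x
        # Average- and max-pool along channels, then a 7x7 conv
        # produces one attention weight per spatial location.
        pooled = torch.cat(
            [x.mean(dim=1, keepdim=True), x.max(dim=1, keepdim=True).values],
            dim=1,
        )
        sa = self.spatial_conv(pooled) * x
        # Parallel fusion: sum the two attended feature maps.
        return ca + sa

# Example: refine a single-modality (e.g., DWI) feature map.
feat = torch.randn(2, 64, 32, 32)
out = DualParallelAttention(64)(feat)
print(out.shape)  # torch.Size([2, 64, 32, 32])

Running the two branches in parallel (rather than sequentially) lets channel and spatial attention each weight the original features independently before fusion, which is consistent with the abstract's description of incorporating both attention types to sharpen single-modality features.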
