Abstract

Traditional methods of diagnosing breast cancer (BC) suffer from human error, lower accuracy, and long turnaround times. A computer-aided detection (CAD) system can overcome these limitations and help radiologists make accurate decisions. However, existing studies using single imaging modalities have shown limited clinical use due to their low diagnostic accuracy and reliability compared to multimodal systems. Thus, we aim to develop a hybrid deep learning bimodal CAD algorithm for the classification of breast lesions using mammogram and ultrasound imaging modalities combined. A combined convolutional neural network (CNN) and long short-term memory (LSTM) model is implemented using images from both mammogram and ultrasound modalities to improve the early diagnosis of BC. A new real-time dataset consisting of 43 mammogram images and 43 ultrasound images collected from 31 patients is used in this work; each group consists of 25 benign and 18 malignant images. The number of images is increased to 1032 (516 per modality) using different data augmentation techniques. The proposed bimodal CAD algorithm achieves a classification accuracy of 99.35% and an area under the receiver operating characteristic curve (AUC) of 0.99, outperforming the traditional unimodal CAD systems, which attain classification accuracies of 97.16% and 98.84% using mammogram and ultrasound, respectively. The proposed bimodal CAD algorithm using combined mammogram and ultrasound thus surpasses the traditional unimodal CAD systems, can help avoid unnecessary biopsies, and encourages clinical adoption.
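The core idea of the bimodal design is to extract features from each modality separately and fuse them before classification. The following is a minimal sketch of that late-fusion pattern, assuming toy image sizes, random projections standing in for the CNN–LSTM feature extractors, and an untrained linear head; all names and dimensions here are illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the per-modality feature extractors.
# In the paper a CNN-LSTM processes each image; here a fixed random
# projection with a tanh nonlinearity is used purely for brevity.
def extract_features(images, proj):
    # images: (batch, H*W) flattened grayscale images
    return np.tanh(images @ proj)            # -> (batch, feat_dim)

H = W = 32                                    # toy image size (assumption)
feat_dim = 16                                 # per-modality feature size (assumption)
proj_mammo = rng.normal(size=(H * W, feat_dim)) / np.sqrt(H * W)
proj_us    = rng.normal(size=(H * W, feat_dim)) / np.sqrt(H * W)

batch = 4
mammo = rng.normal(size=(batch, H * W))       # batch of mammogram images
us    = rng.normal(size=(batch, H * W))       # matched ultrasound images

# Bimodal late fusion: concatenate the two feature vectors per patient.
fused = np.concatenate([extract_features(mammo, proj_mammo),
                        extract_features(us, proj_us)], axis=1)

# Linear classifier head with a sigmoid -> P(malignant) per patient.
w = rng.normal(size=(2 * feat_dim,))
p_malignant = 1.0 / (1.0 + np.exp(-(fused @ w)))
print(p_malignant.shape)                      # one probability per patient
```

In practice each extractor would be trained end-to-end on the augmented dataset, but the fusion step itself is just this concatenation of per-modality feature vectors ahead of a shared classifier.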
