Abstract

Breast Ultrasound (BUS) imaging has been recognized as an essential imaging modality for breast mass classification in China. Current deep learning (DL) based solutions for BUS classification feed ultrasound (US) images into deep convolutional neural networks (CNNs) to learn a hierarchical combination of features that discriminates malignant from benign masses. One problem in existing DL-based BUS classification is the lack of spatial and channel-wise feature weighting, which allows interference from redundant features and leads to low sensitivity. In this study, we aim to incorporate the instructive information provided by the Breast Imaging Reporting and Data System (BI-RADS) into DL-based classification. We propose a novel DL-based BI-RADS Vector-Attention Network (BVA Net) that is trained on both texture information and information decoded from BI-RADS stratifications. Three baseline models, pre-trained DenseNet-121, ResNet-50, and the Residual-Attention Network (RA Net), were included for comparison. Experiments were conducted on a large-scale private main dataset and two public datasets, UDIAT and BUSI. On the main dataset, BVA Net outperformed the other models in terms of AUC (area under the receiver operating characteristic curve, 0.908), accuracy (ACC, 0.865), sensitivity (0.812), and precision (0.795). BVA Net also achieved high AUC (0.87 and 0.882) and ACC (0.859 and 0.843) on UDIAT and BUSI, respectively. Moreover, we propose a method, called integrated classification, that combines the BVA Net binary classification with BI-RADS stratification estimation. Integrated classification improved the overall sensitivity while maintaining a high specificity.
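To make the vector-attention idea concrete, the following is a minimal PyTorch sketch of how a vector decoded from BI-RADS stratifications might re-weight CNN feature channels. The module name, the BI-RADS vector dimension, and the sigmoid-gated fusion are illustrative assumptions, not the paper's published BVA Net implementation.

```python
import torch
import torch.nn as nn

class BIRADSVectorAttention(nn.Module):
    """Hypothetical sketch: re-weight CNN feature channels with a vector
    decoded from BI-RADS stratifications. The dimensions and the
    sigmoid-gated fusion are assumptions, not the paper's architecture."""

    def __init__(self, num_channels: int, birads_dim: int = 8):
        super().__init__()
        # Map the BI-RADS descriptor vector to one weight per channel.
        self.gate = nn.Sequential(
            nn.Linear(birads_dim, num_channels),
            nn.Sigmoid(),  # weights in (0, 1), squeeze-and-excitation style
        )

    def forward(self, feats: torch.Tensor, birads_vec: torch.Tensor) -> torch.Tensor:
        # feats: (B, C, H, W) CNN feature maps; birads_vec: (B, birads_dim)
        w = self.gate(birads_vec)            # (B, C) channel weights
        return feats * w[:, :, None, None]   # broadcast over spatial dims

# Usage: attend over a ResNet-50 stage-4 feature map (2048 channels).
attn = BIRADSVectorAttention(num_channels=2048)
feats = torch.randn(4, 2048, 7, 7)
birads = torch.randn(4, 8)
out = attn(feats, birads)  # (4, 2048, 7, 7)
```

Under similar assumptions, the integrated classification could combine the binary malignancy probability with the estimated BI-RADS category, for example flagging a mass as malignant when either signal is suspicious; such an OR-style rule would be consistent with the reported gain in sensitivity at a maintained specificity, though the paper's exact decision rule is not given here.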
