Abstract
Objective. Ultrasound is the primary screening test for breast cancer. However, providing an interpretable auxiliary diagnosis of breast lesions is a challenging task. This study aims to develop an interpretable auxiliary diagnostic method to enhance usability in human-machine collaborative diagnosis.

Approach. To address this issue, this study proposes the deep multi-stage reasoning method (DMSRM), which provides individual and overall breast imaging-reporting and data system (BI-RADS) assessment categories for breast lesions. In the first stage of the DMSRM, the individual BI-RADS assessment network (IBRANet) is designed to capture lesion features from breast ultrasound images. IBRANet performs individual BI-RADS assessments of breast lesions, focusing on specific features such as margin, contour, echogenicity, calcification, and vascularity. In the second stage, evidence reasoning (ER) is employed to fuse this uncertain information and reach an overall BI-RADS assessment of the breast lesions.

Main results. To evaluate the performance of the DMSRM at each stage, two test sets are utilized: the first for individual BI-RADS assessment, containing 4322 ultrasound images; the second for overall BI-RADS assessment, containing 175 sets of ultrasound image pairs. In the individual BI-RADS assessment of margin, contour, echogenicity, calcification, and vascularity, IBRANet achieves accuracies of 0.9491, 0.9466, 0.9293, 0.9234, and 0.9625, respectively. In the overall BI-RADS assessment of lesions, the ER stage achieves an accuracy of 0.8502. Compared to independent diagnosis, the human-machine collaborative diagnosis results of three radiologists show increases in positive predictive value of 0.0158, 0.0427, and 0.0401; in sensitivity of 0.0400, 0.0600, and 0.0434; and in area under the curve of 0.0344, 0.0468, and 0.0255.

Significance. This study proposes the DMSRM, which enhances the transparency of the diagnostic reasoning process. Results indicate that the DMSRM exhibits robust BI-RADS assessment capabilities and provides an interpretable reasoning process that better suits clinical needs.
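The second stage fuses the per-attribute BI-RADS assessments into a single overall category via evidence reasoning. The abstract does not specify the exact ER rule used by the DMSRM; as a minimal illustration of evidence fusion, the sketch below applies Dempster's classical combination rule to two hypothetical per-attribute belief distributions (the attribute names, category labels, and mass values are invented for demonstration).

```python
# Illustrative sketch only: fusing two per-attribute belief distributions
# over BI-RADS categories with Dempster's rule of combination. This is a
# generic evidence-fusion example, not the paper's exact ER formulation.

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts: hypothesis -> mass)."""
    combined = {}
    conflict = 0.0
    for h1, v1 in m1.items():
        for h2, v2 in m2.items():
            if h1 == h2:
                # Agreeing evidence reinforces the shared hypothesis.
                combined[h1] = combined.get(h1, 0.0) + v1 * v2
            else:
                # Disagreeing evidence contributes to the conflict mass.
                conflict += v1 * v2
    if conflict >= 1.0:
        raise ValueError("Total conflict: evidence cannot be combined")
    # Renormalize so the fused masses sum to 1.
    norm = 1.0 - conflict
    return {h: v / norm for h, v in combined.items()}

# Hypothetical beliefs produced by two attribute-level assessments.
margin_evidence = {"BI-RADS 3": 0.6, "BI-RADS 4": 0.3, "BI-RADS 5": 0.1}
calcif_evidence = {"BI-RADS 3": 0.5, "BI-RADS 4": 0.4, "BI-RADS 5": 0.1}

fused = dempster_combine(margin_evidence, calcif_evidence)
overall = max(fused, key=fused.get)  # most-supported category after fusion
```

In practice, additional attribute assessments (contour, echogenicity, vascularity) would be folded in by applying the combination rule iteratively, since Dempster's rule is associative over independent evidence sources.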