Abstract
The latest developments combining deep learning technology and medical image data have attracted wide attention and provide efficient noninvasive methods for the early diagnosis of breast cancer. The success of this task often depends on large amounts of data annotated by medical experts, which is time-consuming and may not always be feasible in the biomedical field. In addition, the lack of interpretability has greatly hindered the application of deep learning in the medical field. Deep stable learning, which incorporates causal inference, can make deep learning models more predictive and interpretable. In this study, to distinguish malignant tumors among Breast Imaging-Reporting and Data System (BI-RADS) category 3-4A breast lesions, we propose BD-StableNet, a deep stable learning model with automatic detection of lesion areas. In this retrospective study, we collected 3103 breast ultrasound images (1418 benign and 1685 malignant lesions) from 493 patients (361 with benign and 132 with malignant lesions) for model training and testing. Compared with other mainstream deep learning models, BD-StableNet achieves better prediction performance (accuracy = 0.952, area under the curve (AUC) = 0.982, precision = 0.970, recall = 0.941, F1-score = 0.955, and specificity = 0.965). The lesion-area predictions and class activation map (CAM) results both verify that the proposed model is highly interpretable. These results indicate that BD-StableNet significantly enhances diagnostic accuracy and interpretability, offering a promising noninvasive approach for the diagnosis of BI-RADS category 3-4A breast lesions. Clinically, BD-StableNet could reduce unnecessary biopsies, improve diagnostic efficiency, and ultimately enhance patient outcomes by providing more precise and reliable assessments of breast lesions.
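The evaluation metrics reported above (accuracy, AUC, precision, recall, F1-score, and specificity) are standard binary-classification measures. As an illustrative aside only, and not part of the original paper, the sketch below shows how such metrics are typically computed with scikit-learn; the arrays y_true, y_prob, and y_pred are hypothetical placeholders for ground-truth labels, predicted malignancy probabilities, and thresholded predictions.

```python
# Illustrative sketch (not from the paper): computing the reported
# binary-classification metrics with scikit-learn.
# y_true: ground-truth labels (1 = malignant, 0 = benign)
# y_prob: predicted malignancy probabilities
# y_pred: class labels obtained by thresholding the probabilities
import numpy as np
from sklearn.metrics import (accuracy_score, roc_auc_score, precision_score,
                             recall_score, f1_score, confusion_matrix)

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])                  # hypothetical data
y_prob = np.array([0.9, 0.2, 0.8, 0.7, 0.4, 0.1, 0.6, 0.3])  # hypothetical data
y_pred = (y_prob >= 0.5).astype(int)                          # threshold at 0.5

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
metrics = {
    "accuracy":    accuracy_score(y_true, y_pred),
    "AUC":         roc_auc_score(y_true, y_prob),
    "precision":   precision_score(y_true, y_pred),
    "recall":      recall_score(y_true, y_pred),   # sensitivity
    "F1-score":    f1_score(y_true, y_pred),
    "specificity": tn / (tn + fp),                 # true-negative rate
}
print(metrics)
```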