Abstract

Simple Summary: Breast cancer misdiagnoses increase individual and health-system stressors and costs, and result in increased morbidity and mortality. Digital mammography studies are typically about 80% sensitive and 90% specific. Deep vision methods can improve the classification of breast cancer imagery and may further be used to autonomously identify the regions of interest most closely associated with anomalies, supporting clinician analysis. This research explores deep vision techniques for improving mammography classification and for identifying associated regions of interest. The findings contribute to the future of automated assistive diagnosis of breast cancer and the isolation of regions of interest.

(1) Background: The odds of a female being diagnosed with breast cancer have increased from 11:1 in 1975 to 8:1 today. Mammography false positive rates (FPR) are associated with overdiagnosis and overtreatment, while false negative rates (FNR) increase morbidity and mortality. (2) Methods: Deep vision supervised learning classifies 299 × 299 pixel de-noised mammography images as negative or non-negative using models built on 55,890 pre-processed training images and applied to 15,364 unseen test images. A small image representation from the fitted training model is returned to evaluate the portion of the loss-function gradient with respect to the image that maximizes the classification probability. This gradient is then re-mapped onto the original image, highlighting the areas most influential for classification (perhaps masses or boundary areas). (3) Results: Initial classification results were 97% accurate, 99% specific, and 83% sensitive. Gradient techniques for unsupervised region-of-interest mapping clearly identified the areas most associated with the classification results on positive mammograms and might be used to support clinician analysis. (4) Conclusions: Deep vision techniques hold promise for addressing overdiagnosis and overtreatment, underdiagnosis, and automated region-of-interest identification in mammography.
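The gradient re-mapping step described in the Methods can be sketched in miniature. The snippet below is a hedged illustration only: it uses a simple logistic classifier as a stand-in for the paper's deep vision model (whose gradient is computed by automatic differentiation rather than by hand), and tiny 8 × 8 images in place of the 299 × 299 mammograms. All names, sizes, and weights are hypothetical.

```python
import numpy as np

# Hypothetical stand-in for the trained model: score = sigmoid(w . x + b),
# where p is P(non-negative | image). The gradient of p with respect to the
# input pixels indicates which regions most influence the classification.
rng = np.random.default_rng(0)
H = W = 8                       # tiny stand-in for 299 x 299 mammograms
w = rng.normal(size=H * W)      # hypothetical fitted weights
b = 0.0
x = rng.normal(size=H * W)      # one flattened, de-noised image

z = w @ x + b
p = 1.0 / (1.0 + np.exp(-z))    # classification probability

# For the logistic model, dp/dx = p * (1 - p) * w (analytic gradient).
grad = p * (1.0 - p) * w

# Re-map gradient magnitudes back onto the image grid as a saliency map;
# large values mark pixels most influential for the classification.
saliency = np.abs(grad).reshape(H, W)
saliency /= saliency.max()      # normalize to [0, 1] for overlay/display
```

In the paper's setting a deep network replaces the logistic model, but the principle is the same: the normalized gradient magnitude, reshaped to the image grid, highlights candidate regions of interest (perhaps masses or boundary areas).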
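The background and results are stated in terms of sensitivity, specificity, FPR, and FNR, which all follow directly from a confusion matrix. The counts below are hypothetical, chosen only to roughly echo the reported 99% specificity and 83% sensitivity; they are not the study's actual test-set results.

```python
# Illustrative confusion-matrix arithmetic (counts are hypothetical).
TP, FN = 83, 17    # positives: correctly flagged vs. missed (false negatives)
TN, FP = 990, 10   # negatives: correctly cleared vs. over-called (false positives)

sensitivity = TP / (TP + FN)            # true positive rate = 1 - FNR
specificity = TN / (TN + FP)            # true negative rate = 1 - FPR
accuracy = (TP + TN) / (TP + FN + TN + FP)
fpr = FP / (FP + TN)                    # drives overdiagnosis and overtreatment
fnr = FN / (FN + TP)                    # drives missed cancers (morbidity/mortality)

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"accuracy={accuracy:.2f} FPR={fpr:.2f} FNR={fnr:.2f}")
```

This arithmetic makes the trade-off in the abstract concrete: lowering FPR reduces overtreatment, while lowering FNR reduces morbidity and mortality from missed cancers.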

Highlights

  • An estimated 2.3 million women were diagnosed with breast cancer globally in 2020, and female breast cancer has surpassed lung cancer as the most diagnosed cancer in the world [1]

  • This study addresses the problems of overdiagnoses and overtreatment, as well as underdiagnoses with its associated increases in morbidity and mortality, by (1) improving classification of mammography using supervised learning and (2) identifying associated regions of interest using gradient techniques

  • Data are publicly available from the Digital Database for Screening Mammography (DDSM) [30] and the Curated Breast Imaging Subset of DDSM (CBIS-DDSM) [31] and provided by Google’s Kaggle.com, accessed on 5 January 2021 [32]


Introduction

An estimated 2.3 million women were diagnosed with breast cancer globally in 2020, and female breast cancer has surpassed lung cancer as the most diagnosed cancer in the world [1]. The odds of a female being diagnosed with breast cancer have increased from 11:1 in 1975 to 8:1 today. Breast cancer is the most prevalent type of cancer and accounts for the most disability-adjusted life years [3]. Age-adjusted rates show growth in breast cancer diagnoses of 0.3% per year [4,5]. In the United States, about 13% of women will be diagnosed with breast cancer over their lifetimes [6]. Although breast cancer fatality rates have declined 1% since 2013, likely due to advancements in treatment, it remains the second most fatal cancer diagnosis [6].

