Abstract
Prostate cancer screening often relies on costly MRI scans and invasive needle biopsies. Transrectal ultrasound imaging offers a more affordable and non-invasive alternative, but it must contend with high inter-class similarity between benign and malignant prostate lesions and high intra-class variability within each category. This complexity demands finer discrimination of subtle image features for accurate auxiliary diagnosis. In response, we introduce the novel Deep Augmented Metric Learning (DAML) network, tailored to ultrasound-based prostate cancer classification. DAML extends metric learning with a Semantic Differences Mining Strategy (SDMS) that discerns and represents subtle differences in prostate ultrasound images, thereby improving tumor classification accuracy. In addition, DAML addresses class variability and limited sample sizes by combining a Linear Interpolation Augmentation Strategy (LIAS) with a Permutation-Aided Reconstruction Loss (PARL), enriching the feature representation and introducing variability through simple structures that match the efficacy of more advanced sample generation techniques. We comprehensively evaluated DAML, ablating its key components and comparing it against a range of baseline models. DAML achieves classification accuracies of 0.857 and 0.888 for benign and malignant cases, respectively, underscoring its effectiveness in prostate cancer classification via medical imaging.
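The abstract does not specify how LIAS is implemented. As a rough illustration only, the sketch below assumes a mixup-style linear interpolation between feature embeddings within a batch; the function name `lias_augment`, the Beta-distributed mixing weight `alpha`, and the pairing by random permutation are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of a linear-interpolation (mixup-style) augmentation in
# feature space. This is an assumed illustration of what a "Linear
# Interpolation Augmentation Strategy" could look like, not the paper's code.
import torch

def lias_augment(features: torch.Tensor, labels: torch.Tensor, alpha: float = 0.4):
    """Interpolate each embedding with a randomly chosen partner from the batch.

    features: (B, D) batch of embeddings
    labels:   (B,) integer class labels
    Returns the mixed embeddings, the two label sets, and the mixing weight,
    so a downstream loss can weight each label component proportionally.
    """
    # Sample a mixing coefficient from a symmetric Beta distribution.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    # Pair each sample with another sample via a random permutation.
    perm = torch.randperm(features.size(0))
    mixed = lam * features + (1.0 - lam) * features[perm]
    return mixed, labels, labels[perm], lam
```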