Abstract
Background
The Gleason grading system is an important clinical practice for diagnosing prostate cancer in pathology images. However, its application shows significant variability among pathologists, which can have negative clinical consequences. Artificial intelligence methods can provide important support for the pathologist by improving Gleason grade classification. Our purpose is therefore to construct and evaluate the potential of a Convolutional Neural Network (CNN) to classify Gleason patterns.

Methods
The dataset comprised 6982 cancer-containing image patches extracted from radical prostatectomy specimens previously analyzed by an expert uropathologist. A CNN was constructed to classify the Gleason pattern of each patch. Evaluation was carried out by computing the corresponding 3-class confusion matrix and, from it, the precision, sensitivity, and specificity of each class, as well as the overall accuracy. Additionally, k-fold three-way cross-validation was performed to strengthen the evaluation, allow better interpretation, and avoid possible bias.

Results
Overall accuracy reached 98% in the training and validation stage and 94% in the test phase. On the test samples, the true positive ratio between the pathologist and the computational method was 85%, 93%, and 96% for the specific Gleason patterns. Precision, sensitivity, and specificity reached values up to 97%.

Conclusion
The CNN model presented and evaluated showed high accuracy, specifically for neighboring and critical Gleason patterns. The outcomes are in line with and complement others in the literature. The promising results surpassed current inter-pathologist agreement reported in classical studies, evidencing the potential of this novel technology in daily clinical practice.
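
To make the evaluation step concrete, the sketch below shows how per-class precision, sensitivity, and specificity, together with overall accuracy, can be derived from a 3-class confusion matrix. This is a minimal illustration only: the matrix values, the class ordering, and the NumPy-based implementation are assumptions for demonstration, not the study's actual data or code.

```python
import numpy as np

# Hypothetical 3x3 confusion matrix (rows = pathologist label, columns = CNN prediction).
# The counts below are illustrative placeholders, not results from the paper.
cm = np.array([
    [85, 10,  5],
    [ 4, 93,  3],
    [ 2,  2, 96],
])

def per_class_metrics(cm: np.ndarray):
    """Return per-class precision, sensitivity (recall), specificity,
    and overall accuracy for a multi-class confusion matrix."""
    total = cm.sum()
    tp = np.diag(cm)              # true positives per class
    fp = cm.sum(axis=0) - tp      # predicted as class k but belonging to another class
    fn = cm.sum(axis=1) - tp      # belonging to class k but predicted as another class
    tn = total - tp - fp - fn     # all remaining cases
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)  # a.k.a. recall / true positive ratio
    specificity = tn / (tn + fp)
    accuracy = tp.sum() / total
    return precision, sensitivity, specificity, accuracy

precision, sensitivity, specificity, accuracy = per_class_metrics(cm)
print("precision:  ", np.round(precision, 3))
print("sensitivity:", np.round(sensitivity, 3))
print("specificity:", np.round(specificity, 3))
print("accuracy:   ", round(float(accuracy), 3))
```

In a k-fold setting, these metrics would typically be computed on the held-out fold of each split and then averaged, which is one way to obtain the kind of aggregate figures reported above.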