Abstract

Uterine cancer (also known as endometrial cancer) can seriously affect the female reproductive system, and histopathological image analysis is the gold standard for diagnosing endometrial cancer. Owing to their limited ability to model the complicated relationships between histopathological images and their interpretations, existing computer-aided diagnosis (CAD) approaches based on traditional machine learning algorithms often fail to achieve satisfactory results. In this study, we develop a CAD approach, called HIENet, based on a convolutional neural network (CNN) with attention mechanisms. In ten-fold cross-validation on ∼3,300 hematoxylin and eosin (H&E) image patches from ∼500 endometrial specimens, HIENet achieved a 76.91 ± 1.17% (mean ± s.d.) accuracy over four classes of endometrial tissue, i.e., normal endometrium, endometrial polyp, endometrial hyperplasia, and endometrial adenocarcinoma. HIENet also obtained an area under the curve (AUC) of 0.9579 ± 0.0103, with 81.04 ± 3.87% sensitivity and 94.78 ± 0.87% specificity, in a binary classification task detecting endometrioid adenocarcinoma. Furthermore, in external validation on 200 H&E image patches from 50 randomly selected female patients, HIENet achieved an 84.50% accuracy in the four-class classification task, as well as an AUC of 0.9829 with 77.97% (95% confidence interval, CI, 65.27%∼87.71%) sensitivity and 100% (95% CI, 97.42%∼100.00%) specificity. The proposed CAD method outperformed three human experts and five CNN-based classifiers in overall classification performance. It also provided pathologists with better interpretability of diagnoses by highlighting the correlations between local pixel-level image features and the morphological characteristics of endometrial tissue.
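
To make the quoted evaluation metrics concrete, the following is a minimal Python sketch (not the authors' code) of how the binary-task AUC, sensitivity, specificity, and exact 95% confidence intervals could be computed from per-patch ground-truth labels and predicted probabilities. The variable names (`y_true`, `y_score`), the 0.5 decision threshold, and the choice of a Clopper-Pearson interval are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix
from statsmodels.stats.proportion import proportion_confint

def binary_report(y_true, y_score, threshold=0.5):
    """Compute AUC, sensitivity, and specificity with exact 95% CIs.

    y_true  -- array of ground-truth labels (1 = adenocarcinoma, 0 = other)
    y_score -- array of predicted probabilities for the positive class
    """
    y_pred = (np.asarray(y_score) >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    sensitivity = tp / (tp + fn)  # true positive rate
    specificity = tn / (tn + fp)  # true negative rate
    # Exact binomial (Clopper-Pearson) 95% CIs -- one plausible choice for
    # the intervals quoted in the abstract.
    sens_ci = proportion_confint(tp, tp + fn, alpha=0.05, method="beta")
    spec_ci = proportion_confint(tn, tn + fp, alpha=0.05, method="beta")
    return {
        "auc": roc_auc_score(y_true, y_score),
        "sensitivity": (sensitivity, sens_ci),
        "specificity": (specificity, spec_ci),
    }
```

In a cross-validation setting such as the ten-fold protocol above, this report would be computed once per fold and the per-fold values summarized as mean ± s.d.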
