Abstract
Accurate and rapid discrimination between benign and malignant ovarian masses is crucial for optimal patient management. This study aimed to establish an ultrasound image-based nomogram combining clinical, radiomics, and deep transfer learning features to automatically classify ovarian masses as low-risk or intermediate-high-risk of malignancy according to the Ovarian-Adnexal Reporting and Data System (O-RADS). The ultrasound images of 1,080 patients with 1,080 ovarian masses were included. The training cohort, consisting of 683 patients, was collected at the South China Hospital of Shenzhen University, and the test cohort, consisting of 397 patients, was collected at the Shenzhen University General Hospital. The workflow included image segmentation, feature extraction, feature selection, and model construction. The pre-trained ResNet-101 model achieved the best performance. Among the mono-modal feature models and fusion feature models, the nomogram achieved the highest diagnostic performance (AUC: 0.930, accuracy: 84.9%, sensitivity: 93.5%, specificity: 81.7%, PPV: 65.4%, NPV: 97.1%, precision: 65.4%). The diagnostic indices of the nomogram were higher than those of junior radiologists, and the diagnostic indices of junior radiologists improved significantly with the assistance of the model. The calibration curves showed good agreement between the predictions of the nomogram and the actual classification of ovarian masses. The decision curve analysis showed that the nomogram was clinically useful. This model exhibited satisfactory diagnostic performance compared to junior radiologists. It has the potential to improve the expertise of junior radiologists and provide a fast and effective method for ovarian cancer screening.
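The fusion step the abstract describes, combining clinical, radiomics, and deep transfer learning features into a single risk classifier, can be sketched as follows. This is an illustrative sketch only, not the authors' code: the feature dimensions and the synthetic data are assumptions, and a nomogram is here approximated by the logistic regression model it is conventionally built on.

```python
# Sketch: feature-level fusion of clinical, radiomics, and deep features
# followed by a logistic regression classifier (the model family a nomogram
# typically visualizes). All data below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
y = rng.integers(0, 2, size=n)  # 0 = low risk, 1 = intermediate-high risk

# Hypothetical feature blocks; sizes are illustrative assumptions.
clinical  = rng.normal(size=(n, 3))  + y[:, None] * 0.8   # e.g. age, tumor markers
radiomics = rng.normal(size=(n, 10)) + y[:, None] * 0.5   # selected handcrafted features
deep      = rng.normal(size=(n, 20)) + y[:, None] * 0.3   # selected ResNet-101 features

X = np.hstack([clinical, radiomics, deep])  # feature-level fusion
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"test AUC: {auc:.3f}")
```

In the study itself, the deep block would come from a pre-trained ResNet-101 applied to the segmented ultrasound images, with feature selection applied before fusion; the reported AUC of 0.930 refers to that pipeline, not to this toy example.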