Abstract

Purpose
We developed an annotation-free, image-translation-based approach for 3D segmentation of prostate glands within 3D histology datasets of whole biopsies stained with a low-cost, rapidly diffusing fluorescent analog of H&E.

Introduction
The current diagnostic gold standard of histopathology provides a limited 2D view of complex 3D glandular structures in prostate specimens, which contributes to high interobserver variability and reduced prognostic accuracy. We have recently developed nondestructive 3D pathology methods based on open-top light-sheet (OTLS) microscopy. To train prognostic models based on 3D glandular morphology, we developed an objective (biomarker-based) 3D gland-segmentation method that does not rely on tedious and subjective manual annotations and that can operate on images of tissue stained with an inexpensive, fast-diffusing (small-molecule) fluorescent analog of H&E.

Methods
We first convert H&E-analog images into synthetic 3D immunofluorescence (IF) images of cytokeratin 8 (CK8), a biomarker expressed by the luminal epithelial cells that surround all prostate glands. This conversion is performed by treating the 3D data as sequences of 2D images and adapting a generative adversarial network (GAN)-based video-synthesis model to perform image-sequence translation with high depth-wise continuity between frames. Based on the synthetic CK8 images, glands are then segmented objectively in 3D using a thresholding/morphology-based algorithm. This two-step method obviates the need for the labor-intensive and subjective manual 3D annotations that would be required to train a single-step segmentation model. A 3D structural similarity (SSIM) index was computed between synthetic and real CK8 images. In addition, based on ground-truth manual annotations of glands in ten 0.2-mm³ regions, we calculated voxel-based Dice coefficients to compare our segmentation accuracy against two baseline methods.

Results
The synthetic CK8 images exhibited high fidelity (3D SSIM = 0.696) and excellent continuity with depth. Our segmentation accuracy outperformed both baseline methods, with Dice coefficients (averaged over the 10 regions) of 0.882 (our method), 0.725 (3D watershed), and 0.643 (2D U-Net). We are now applying our method to 3D histology datasets of whole biopsies (n > 1000) acquired ex vivo from prostatectomy specimens (N ~ 200), from which we are extracting 3D histomorphometric gland features to predict biochemical recurrence (BCR) after prostatectomy in men with prostate cancer.

Conclusions
Our annotation-free segmentation method generates synthetic 3D IF images from H&E-analog images in order to objectively segment the 3D prostate gland network. These accurate 3D segmentations are being extended to whole-biopsy 3D pathology datasets for prostate cancer risk assessment.

Citation Format: Weisi Xie, Adam Glaser, Nicholas Reder, Nadia Postupna, Chenyi Mao, Can Koyuncu, Patrick Leo, Robert Serafin, Hongyi Huang, Anant Madabhushi, Lawrence True, Jonathan T.C. Liu. Annotation-free 3D gland segmentation with generative image-sequence translation for prostate cancer risk assessment [abstract]. In: Proceedings of the AACR Virtual Special Conference on Artificial Intelligence, Diagnosis, and Imaging; 2021 Jan 13-14. Philadelphia (PA): AACR; Clin Cancer Res 2021;27(5_Suppl):Abstract nr PO-017.
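To make the image-sequence translation step in the Methods concrete, the sketch below walks a 3D H&E-analog stack slice by slice and conditions each synthetic CK8 frame on the previously synthesized frame, which is one simple way to encourage depth-wise continuity. This is a minimal illustration only: the FrameGenerator layers, the 2-channel H&E-analog input, and all tensor shapes are assumptions, not the authors' GAN-based video-synthesis architecture, which the abstract does not specify.

```python
import torch
import torch.nn as nn


class FrameGenerator(nn.Module):
    """Stand-in generator: maps one H&E-analog slice plus the previously
    synthesized CK8 slice to the next synthetic CK8 slice. The layers here
    are placeholders, not the published model."""

    def __init__(self, he_channels: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(he_channels + 1, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
            nn.Sigmoid(),
        )

    def forward(self, he_slice: torch.Tensor, prev_ck8: torch.Tensor) -> torch.Tensor:
        # Conditioning on the previous synthetic frame is what promotes
        # depth-wise continuity between consecutive slices.
        return self.net(torch.cat([he_slice, prev_ck8], dim=1))


@torch.no_grad()
def translate_volume(generator: FrameGenerator, he_volume: torch.Tensor) -> torch.Tensor:
    """Translate a (depth, channels, H, W) H&E-analog stack into a
    (depth, 1, H, W) synthetic CK8 stack, one slice at a time."""
    depth, _, height, width = he_volume.shape
    prev_ck8 = torch.zeros(1, 1, height, width)
    ck8_frames = []
    for z in range(depth):
        prev_ck8 = generator(he_volume[z:z + 1], prev_ck8)
        ck8_frames.append(prev_ck8)
    return torch.cat(ck8_frames, dim=0)


# Example with a random 64-slice, 2-channel stack (purely illustrative data):
# ck8 = translate_volume(FrameGenerator(he_channels=2), torch.rand(64, 2, 256, 256))
```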
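The second step, thresholding/morphology-based gland segmentation from the synthetic CK8 volume, can be illustrated with standard volumetric operations. The specific choices below (Otsu threshold, ball-shaped 3D closing, hole filling, small-object removal) are assumptions for illustration; the abstract does not state the exact operators or parameters used.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, morphology


def segment_glands_from_ck8(ck8_volume: np.ndarray, min_voxels: int = 1000) -> np.ndarray:
    """Rough 3D gland mask from a (synthetic) CK8 volume.

    CK8 delineates the luminal epithelial shell around each gland, so a
    simple strategy is to threshold that shell, close small gaps, and fill
    the enclosed lumina to obtain solid 3D gland objects.
    """
    # 1. Threshold the CK8 channel (Otsu chosen here as a placeholder).
    shell = ck8_volume > filters.threshold_otsu(ck8_volume)

    # 2. Close small gaps in the epithelial shell in 3D.
    shell = morphology.binary_closing(shell, morphology.ball(3))

    # 3. Fill enclosed lumina so each gland becomes a solid 3D object.
    glands = ndi.binary_fill_holes(shell)

    # 4. Drop small spurious components.
    glands = morphology.remove_small_objects(glands, min_size=min_voxels)
    return glands
```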

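For the reported evaluation metrics, the sketch below computes a volumetric SSIM with scikit-image and a voxel-based Dice coefficient between binary masks. It assumes the SSIM is applied directly to the full 3D volumes with default windowing; the authors' exact SSIM configuration is not stated in the abstract.

```python
import numpy as np
from skimage.metrics import structural_similarity


def ssim_3d(real_ck8: np.ndarray, synthetic_ck8: np.ndarray) -> float:
    """SSIM between real and synthetic CK8 volumes, treated as 3D images."""
    data_range = float(real_ck8.max() - real_ck8.min())
    return structural_similarity(real_ck8, synthetic_ck8, data_range=data_range)


def dice_coefficient(pred_mask: np.ndarray, gt_mask: np.ndarray) -> float:
    """Voxel-based Dice coefficient between two binary 3D masks."""
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)
    denom = pred.sum() + gt.sum()
    # Assumes at least one mask is non-empty; returns 0.0 otherwise.
    return 2.0 * np.logical_and(pred, gt).sum() / denom if denom else 0.0
```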