Abstract

Segmentation of lung tumors is an important step in planning clinical treatment and operative procedures. Automatic segmentation of advanced lung tumors is challenging due to local invasion and heterogeneous metastasis. Fully automatic deep learning models often fail to accurately delineate advanced disease boundaries and do not allow the user to provide hints to improve segmentation performance. In contrast, interactive segmentation accepts user inputs and is an efficient way to segment lesions while reducing radiologist effort. This paper describes an interactive segmentation approach for lung lesion segmentation in CT scans. Our pipeline is built atop the SwinUNETR neural network architecture and can be extended to support alternative models. The backbone model accepts radiologists' hints in the form of mouse clicks that indicate whether a location on the scan corresponds to a lesion. Our approach makes the following novel contributions: (1) Context-aware encoding of guidance clicks by combining geodesic and Gaussian smoothing, resulting in improved segmentation of lesion boundaries. (2) An iterative prompting strategy to achieve higher accuracy with fewer clicks at inference time. (3) Guidance-aware Conditional Random Fields to refine the produced segmentation masks. (4) Volume cropping around guidance clicks at inference to improve segmentation precision and inference speed. (5) Linear scaling of our approach across multiple GPUs to further improve inference speed. On a public non-small cell lung cancer dataset consisting predominantly of advanced-stage cancers (68% stage III and 27% stage IV), our interactive approach delivers a Dice score of 0.74 with a single click, a 21% improvement over the fully automatic approach (Dice score = 0.61), and a Dice score of 0.81 with 5 clicks, a 33% improvement. Inference speed is an important metric for an interactive pipeline. On a typical 350 × 350 × 150 voxel chest CT volume, our interactive approach runs inference 5 times faster than the original fully automated approach on a single GPU. After scaling to 4 GPUs, inference takes only 0.5 seconds, 20 times faster than the original approach. Qualitative evaluation by three independent radiologists indicates that our interactive pipeline significantly improved their standard clinical segmentation experience.

Citation Format: Mayank Patwari, Yi Wei, Meng Xu, Zhenning Zhang, Konstantinos Sidiropoulos, Balaji Selvaraj, Georgia Hughes, Nadine Garibli, Kamsiriochukwu Ojiako, Mohan Lella, Leon Fedden, James Parkin, Michael Parker, Shaneil Patel, Qin Li, Kedar Patwardhan. Fast, interactive, AI-assisted 3D lung tumour segmentation [abstract]. In: Proceedings of the American Association for Cancer Research Annual Meeting 2024; Part 1 (Regular Abstracts); 2024 Apr 5-10; San Diego, CA. Philadelphia (PA): AACR; Cancer Res 2024;84(6_Suppl):Abstract nr 887.
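
To make the click-guidance and cropping ideas described above concrete, the following Python sketch illustrates one plausible way to (a) encode mouse clicks as Gaussian-smoothed guidance channels appended to the CT input and (b) crop a subvolume around the clicks before inference. The function names, crop size, sigma value, and the use of numpy/scipy are assumptions for illustration only and are not taken from the authors' code; the geodesic component of the click encoding is omitted from this sketch.

import numpy as np
from scipy.ndimage import gaussian_filter

def encode_clicks(volume, fg_clicks, bg_clicks, sigma=2.0):
    """Illustrative sketch: turn click coordinates into two smoothed guidance channels.

    volume    : (D, H, W) CT volume, used here only for its shape.
    fg_clicks : list of (z, y, x) voxel coordinates marked as lesion.
    bg_clicks : list of (z, y, x) voxel coordinates marked as background.
    sigma     : standard deviation (in voxels) of the Gaussian smoothing kernel.
    """
    fg_map = np.zeros(volume.shape, dtype=np.float32)
    bg_map = np.zeros(volume.shape, dtype=np.float32)
    for z, y, x in fg_clicks:
        fg_map[z, y, x] = 1.0
    for z, y, x in bg_clicks:
        bg_map[z, y, x] = 1.0
    # Gaussian smoothing spreads each click over a local neighbourhood; the paper
    # additionally uses geodesic smoothing so that guidance follows image
    # intensities, which is not reproduced in this sketch.
    fg_map = gaussian_filter(fg_map, sigma=sigma)
    bg_map = gaussian_filter(bg_map, sigma=sigma)
    # Stack CT intensities and the two guidance channels as the network input.
    return np.stack([volume, fg_map, bg_map], axis=0)

def crop_around_clicks(volume, clicks, crop_size=(96, 96, 96)):
    """Illustrative sketch: crop a fixed-size subvolume centred on the clicks,
    clamped to the scan bounds, and return the offset needed to paste the
    prediction back into the full volume."""
    clicks = np.asarray(clicks)                       # (N, 3) voxel coordinates
    centre = clicks.mean(axis=0).round().astype(int)  # centroid of all clicks
    start = []
    for c, size, dim in zip(centre, crop_size, volume.shape):
        start.append(int(np.clip(c - size // 2, 0, max(dim - size, 0))))
    z, y, x = start
    dz, dy, dx = crop_size
    crop = volume[z:z + dz, y:y + dy, x:x + dx]
    return crop, (z, y, x)

In such a setup, only the cropped subvolume would be passed through the network, and the predicted mask would be pasted back at the returned offset, which is one way cropping can reduce both inference time and false positives far from the user's clicks.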
