Abstract
Objectives: In a standard free-breathing PET/CT imaging protocol, a single CT image of the patient is acquired before the PET scan and used for attenuation correction of the PET data. A CT scanner with a long axial field of view and fast rotation speed can capture the patient's body at a single respiratory phase even during free breathing. However, since breath-holding is infeasible during the PET scan, the single CT image is inadequate for attenuation correction of the entire PET dataset. The aim of this study is to design an automatic workflow to correct the mismatch between PET and CT data obtained in free-breathing scans with different tracers.

Methods: First, the PET data were divided into four equal-count respiratory frames using a previously developed data-driven respiratory gating method. The attenuation map from the CT scan was modified by filling the lung region with the attenuation coefficient of soft tissue before it was used for attenuation correction of the gated PET data, to reduce activity-attenuation mismatch. Second, the respiratory phase of the original CT image was identified automatically. Two regions of interest (ROIs), each containing one of the two lungs and the organs below it, were segmented from the CT image. The mutual information (MI) between each gated PET reconstruction and the CT image was measured inside both ROIs for all gates. Since the left ROI, containing the lung-stomach boundary, and the right ROI, containing the lung-liver boundary, may show different contrast for different patients and tracer types, the MI measurements of the two ROIs were ranked separately across all gates. The ROI with the larger minimum-to-maximum MI difference was used for phase determination, and the gate with the maximum MI in this ROI was taken as the respiratory phase of the CT image. The automatically determined phase was validated by visual inspection of the PET and CT images. Third, the gated PET reconstructions generated with the modified attenuation map were used for motion vector field (MVF) estimation. MVFs from each of the other gates to the reference gate were estimated using a B-spline-based multi-resolution image registration algorithm. The cost function of the algorithm consists of the summed squared error between the two images and a smoothness constraint on the MVF. Fourth, the CT image was transformed to each of the other frames using the estimated MVFs to obtain attenuation maps for all respiratory frames. Finally, image reconstruction of the gated PET data was repeated using phase-matched gated PET-CT data pairs.

Results: The method was applied to clinical datasets acquired with both 18F-FDG and 68Ga-DOTA-NOC tracers on a PET/CT scanner featuring a 128-ring CT system and a high-resolution PET system. Between the two ROIs, the one with higher organ contrast and larger motion amplitude proved more useful for determining the phase of the CT image. The proposed method successfully identified the respiratory phase of the CT image and generated phase-matched CT images for all respiratory phases of the PET data. PET data reconstructed with the proposed method showed greatly reduced mismatch artifacts.

Conclusions: In this study, we proposed an automatic mismatch correction method for free-breathing PET/CT datasets. Mutual information within segmented ROIs was used as the similarity measure to identify the respiratory phase of the free-breathing CT image. The attenuation map for each respiratory gate was created by transforming the only available attenuation map with MVFs estimated from gated PET images. Reduced activity-attenuation mismatch was observed in clinical data processed with the proposed method.
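The abstract does not give implementation details for the MI-based phase identification in the second step of the Methods. The following Python sketch shows one way the described rule could be realized, assuming the gated PET reconstructions and the CT have been resampled to a common voxel grid and binary ROI masks are available; the histogram-based MI estimator and all names (mutual_information, identify_ct_phase, gated_pet, ct, rois, bins) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def mutual_information(a, b, bins=64):
    """Histogram-based mutual information between two flattened arrays of voxel values."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def identify_ct_phase(gated_pet, ct, rois):
    """gated_pet: list of gated PET volumes; ct: CT volume on the same grid;
    rois: list of boolean masks (e.g., left and right lung ROIs).
    Returns the index of the gate whose PET image best matches the CT."""
    # MI for every (ROI, gate) pair
    mi = np.array([[mutual_information(pet[roi], ct[roi]) for pet in gated_pet]
                   for roi in rois])
    # Keep the ROI whose MI varies most across gates (largest max-min spread),
    # i.e. the one with stronger organ contrast and motion amplitude.
    best_roi = int(np.argmax(mi.max(axis=1) - mi.min(axis=1)))
    # The gate with maximum MI in that ROI is taken as the CT respiratory phase.
    return int(np.argmax(mi[best_roi]))
```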
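The registration cost is described only qualitatively as a summed squared error term plus a smoothness constraint on the MVF. One common form consistent with that description, with the weight $\lambda$ and the choice of regularizer being assumptions rather than details from the abstract, is

$$C(\mathbf{T}) = \sum_{\mathbf{x}} \left[ I_{\mathrm{ref}}(\mathbf{x}) - I_g\!\left(\mathbf{T}(\mathbf{x})\right) \right]^2 + \lambda\, R(\mathbf{T}),$$

where $I_{\mathrm{ref}}$ is the reference-gate PET image, $I_g$ is the PET image of another gate, $\mathbf{T}$ is the B-spline-parameterized transformation defining the MVF, and $R(\mathbf{T})$ is a smoothness regularizer such as the bending energy of the deformation.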
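The fourth step, warping the CT-derived attenuation map into the other respiratory frames with the estimated MVFs, could be realized with a simple pull-back interpolation. The sketch below is a minimal illustration assuming a dense MVF given in voxel units on the CT grid; the direction convention (sampling the source at x + d(x)) may differ from the authors' implementation.

```python
# Minimal sketch of warping an attenuation map with a dense motion vector field.
# Assumption: `mvf` has shape (3, nz, ny, nx) and gives voxel displacements that
# map target-frame coordinates to source-frame coordinates (pull-back convention).
import numpy as np
from scipy.ndimage import map_coordinates

def warp_attenuation_map(mu_map, mvf):
    grid = np.indices(mu_map.shape, dtype=float)   # target-frame voxel coordinates
    coords = grid + mvf                            # where to sample the source map
    return map_coordinates(mu_map, coords, order=1, mode='nearest')
```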