Abstract
Purpose: To develop a magnetic resonance (MR)-based method for estimating continuous linear attenuation coefficients (LACs) for positron emission tomography (PET) using a physical compartmental model and ultrashort echo time (UTE)/multi-echo Dixon (mUTE) acquisitions.

Methods: We propose a three-dimensional (3D) mUTE sequence that acquires signals from water, fat, and short-T2 components (e.g., bone) simultaneously in a single acquisition. The sequence integrates 3D UTE with multi-echo Dixon acquisitions and uses sparse radial trajectories to accelerate imaging. Errors in the radial k-space trajectories are measured with a dedicated trajectory-mapping sequence and corrected during image reconstruction. A physical compartmental model is fitted to the measured multi-echo MR signals to obtain the water, fat, and bone fractions of each voxel, which are then used to estimate a continuous LAC map for PET attenuation correction. The performance of the proposed method was evaluated in phantom and in vivo human studies, using LACs derived from computed tomography (CT) as the reference.

Results: Compared with Dixon- and atlas-based MR attenuation correction (MRAC) methods, the proposed method yielded PET images with higher correlation and similarity to the reference. The relative absolute errors of the PET activity values reconstructed with the proposed method were below 5% in all four lobes (frontal, temporal, parietal, and occipital), the cerebellum, and the whole white-matter and gray-matter regions across all subjects (n = 6).

Conclusion: The proposed mUTE method can generate subject-specific, continuous LAC maps for PET attenuation correction in PET/MR.
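To illustrate the idea of mapping fitted compartment fractions to a continuous LAC, the sketch below shows a minimal per-voxel example. It is not the authors' implementation: the signal model (a single fat chemical-shift peak, fixed soft-tissue and bone T2* values), the echo times, the 511 keV attenuation coefficients, and all function names (e.g., `fit_voxel`) are illustrative assumptions chosen only to demonstrate the fit-then-mix principle described in the abstract.

```python
# Minimal, illustrative sketch of a three-compartment (water/fat/bone) fit
# followed by LAC mixing. All numeric values and names are assumptions,
# not parameters from the paper.
import numpy as np
from scipy.optimize import least_squares

TE = np.array([0.07e-3, 1.23e-3, 2.46e-3, 3.69e-3, 4.92e-3])  # echo times (s), hypothetical
DF_FAT = -440.0      # approximate fat chemical shift at 3 T (Hz)
T2S_SOFT = 25e-3     # assumed soft-tissue T2* (s)
T2S_BONE = 0.4e-3    # assumed cortical-bone T2* (s)

# Approximate linear attenuation coefficients at 511 keV (cm^-1)
MU = {"water": 0.096, "fat": 0.090, "bone": 0.151}

def model(params, te):
    """Complex multi-echo signal from water, fat, and short-T2 (bone) components."""
    w, f, b, phi = params
    soft = (w + f * np.exp(2j * np.pi * DF_FAT * te)) * np.exp(-te / T2S_SOFT)
    bone = b * np.exp(-te / T2S_BONE)
    return (soft + bone) * np.exp(1j * phi)

def residuals(params, te, signal):
    r = model(params, te) - signal
    return np.concatenate([r.real, r.imag])  # real-valued residuals for least_squares

def fit_voxel(signal, te=TE):
    """Fit component amplitudes for one voxel and map the fractions to a continuous LAC."""
    x0 = np.array([0.5, 0.3, 0.2, 0.0])
    sol = least_squares(residuals, x0,
                        bounds=([0, 0, 0, -np.pi], [np.inf, np.inf, np.inf, np.pi]),
                        args=(te, signal))
    w, f, b, _ = sol.x
    total = w + f + b + 1e-12
    fw, ff, fb = w / total, f / total, b / total
    lac = fw * MU["water"] + ff * MU["fat"] + fb * MU["bone"]
    return (fw, ff, fb), lac

# Usage: synthesize one bone-dominated voxel and recover its fractions and LAC.
truth = np.array([0.2, 0.1, 0.7, 0.3])
meas = model(truth, TE) + 0.005 * (np.random.randn(TE.size) + 1j * np.random.randn(TE.size))
fractions, lac = fit_voxel(meas)
print(f"fractions (w, f, b) = {fractions}, LAC ~ {lac:.3f} cm^-1")
```

In this toy formulation the continuous LAC is simply a fraction-weighted sum of nominal component coefficients; the actual mUTE pipeline operates on reconstructed radial k-space data and may use a different signal model and LAC mapping.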