Abstract

MRI can produce quantitative liver fat fraction (FF) maps noninvasively, which can improve the diagnosis of fatty liver disease. However, most sequences acquire several two-dimensional (2D) slices during one or more breath-holds, which may be difficult for patients with limited breath-holding capacity. A whole-liver 3D FF map could instead be obtained in a single acquisition, provided that a reliable breathing-motion correction method is applied. Several correction techniques are available for 3D imaging, but they rely on external devices, interrupt the acquisition, or compromise the spatial resolution. To overcome these issues, a proof-of-concept study introducing a self-navigated 3D three-point Dixon sequence is presented here. A respiratory self-gating strategy acquiring a center k-space profile was integrated into a three-point Dixon sequence. We obtained 3D FF maps from a water-fat emulsion phantom and fifteen volunteers, and compared the sequence with multi-2D breath-hold and 3D free-breathing approaches. The self-navigated 3D three-point Dixon sequence corrected respiratory-motion artifacts and provided more precise FF measurements than the breath-hold multi-2D and 3D free-breathing techniques. Magn Reson Med 76:1400-1409, 2016. © 2015 International Society for Magnetic Resonance in Medicine.
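
To illustrate the two ingredients described in the abstract, the sketch below shows (a) how a respiratory surrogate signal can be derived from repeatedly acquired center-of-k-space profiles and used to accept data near end-expiration, and (b) how a fat-fraction map can be computed voxel-wise from three Dixon echoes. This is a minimal sketch, not the authors' reconstruction: it assumes NumPy, ignores B0 field-map and T2* estimation, uses a nominal single-peak fat chemical shift (FAT_SHIFT_HZ), and the function names (respiratory_waveform, gate_accept, three_point_dixon_ff) are purely illustrative.

```python
import numpy as np

FAT_SHIFT_HZ = -434.0  # nominal fat chemical shift near 3 T (assumption; scanner-dependent)

def respiratory_waveform(center_profiles):
    """Estimate a respiratory surrogate from repeated center-of-k-space readouts.

    center_profiles: (n_repeats, n_readout) complex array of k-space center lines.
    The magnitude of the central (DC) sample tracks bulk signal changes caused by
    liver/diaphragm displacement during breathing.
    """
    dc = np.abs(center_profiles[:, center_profiles.shape[1] // 2])
    # remove slow signal drift with a simple moving-average baseline
    kernel = np.ones(31) / 31
    baseline = np.convolve(dc, kernel, mode="same")
    return dc - baseline

def gate_accept(waveform, acceptance=0.4):
    """Accept the fraction of profiles closest to end-expiration (lowest waveform values)."""
    thresh = np.quantile(waveform, acceptance)
    return waveform <= thresh

def three_point_dixon_ff(echoes, tes):
    """Voxel-wise water/fat separation from three echoes, then fat fraction.

    echoes: (3, ...) complex images; tes: echo times in seconds.
    Assumes the B0 phase term has already been demodulated, so the model reduces
    to a linear least-squares fit of s(TE) = W + F * exp(i * 2*pi * f_fat * TE).
    """
    tes = np.asarray(tes, dtype=float)
    A = np.stack([np.ones(3), np.exp(2j * np.pi * FAT_SHIFT_HZ * tes)], axis=1)
    s = echoes.reshape(3, -1)
    wf, *_ = np.linalg.lstsq(A, s, rcond=None)
    W, F = np.abs(wf[0]), np.abs(wf[1])
    ff = F / np.maximum(W + F, 1e-12)
    return ff.reshape(echoes.shape[1:])
```

In a practical reconstruction, a multi-peak fat spectral model and joint field-map (and possibly T2*) estimation would replace the single-peak linear fit used here, and only the k-space profiles flagged by the gating mask would contribute to the 3D image.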
