Abstract

Gadoxetic acid uptake rate (k1) obtained from dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) is a promising measure of regional liver function. Clinical exams are typically poorly characterized temporally, with low temporal resolution (LTR) compared with high temporal resolution (HTR) experimental acquisitions, and clinical demands incentivize shortening these exams further. This study develops a neural network-based approach to quantifying k1 that is more robust than current models such as the linearized single-input, two-compartment (LSITC) model. Thirty liver HTR DCE MRI exams were acquired in 22 patients, each with at least 16 min of postcontrast data sampled at least every 13 s. A simple neural network (NN) with four hidden layers was trained on voxel-wise LTR data to predict k1. The LTR data were created by subsampling the HTR data to six time points, replicating the characteristics of clinical LTR acquisitions. Both the total length of the training data and the placement of its time points were varied considerably to encourage robustness to such variation. A generative adversarial network (GAN) was used to generate arterial and portal venous input functions for data augmentation based on the dual-input, two-compartment pharmacokinetic model of gadoxetic acid in the liver. The performance of the NN was compared with direct application of LSITC to both LTR and HTR data. Error was assessed for acquisition lengths subsampled from 16 min down to 4 min, enabling assessment of robustness to shortened acquisitions. For 16 min acquisitions, the normalized root-mean-squared error (NRMSE) in k1 was 0.60, 1.77, and 1.21 for LSITC applied to HTR data, LSITC applied to LTR data, and the GAN-augmented NN applied to LTR data, respectively. As the acquisition length was shortened, errors for the LSITC approaches increased several-fold. For acquisitions shorter than 12 min, the GAN-augmented NN approach outperformed the LSITC approach to a statistically significant extent, even when LSITC was applied to HTR data. The study indicates that acquisition length strongly affects LSITC analysis of DCE data at standard temporal sampling, and that machine learning methods, such as the implemented NN, can be far more resilient to shortened acquisition times than direct fitting to the LSITC model.
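To make the described setup concrete, the sketch below shows one plausible form of a small fully connected network that maps a voxel's six-point LTR enhancement curve (with its sample times) to a single k1 estimate, together with the NRMSE metric quoted above. This is not the authors' implementation: the layer widths, the choice of inputs, the optimizer settings, and the normalization used in NRMSE are assumptions for illustration only.

```python
# Hypothetical sketch of a voxel-wise k1 regressor with four hidden layers.
import torch
import torch.nn as nn

class K1Net(nn.Module):
    def __init__(self, n_timepoints: int = 6, hidden: int = 64):
        super().__init__()
        # Six sampled concentrations plus their acquisition times -> 12 inputs (assumed).
        self.net = nn.Sequential(
            nn.Linear(2 * n_timepoints, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),  # four hidden layers in total
            nn.Linear(hidden, 1),                  # predicted k1
        )

    def forward(self, x):
        return self.net(x)

def nrmse(pred: torch.Tensor, ref: torch.Tensor) -> torch.Tensor:
    """Root-mean-squared error normalized by the mean reference value
    (the paper's exact normalization is not stated in the abstract)."""
    return torch.sqrt(torch.mean((pred - ref) ** 2)) / torch.mean(ref)

# Usage sketch with random stand-in data: curves and times are (n_voxels, 6);
# k1_ref would be reference k1 values, e.g. from LSITC fits to full HTR data.
model = K1Net()
curves = torch.rand(128, 6)
times = torch.sort(torch.rand(128, 6), dim=1).values
k1_ref = torch.rand(128, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(10):  # a few illustrative training steps
    optimizer.zero_grad()
    pred = model(torch.cat([curves, times], dim=1))
    loss = nn.functional.mse_loss(pred, k1_ref)
    loss.backward()
    optimizer.step()
print(float(nrmse(model(torch.cat([curves, times], dim=1)), k1_ref)))
```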
