Abstract

Introduction: Continuous renal replacement therapy with regional citrate anticoagulation is commonly used as a modality of organ support in critically ill patients. Currently, citrate accumulation or toxicity is assessed using surrogate markers, notably the uncorrected total-to-ionized calcium ratio. The accuracy and utility of this method have been questioned.

Objectives/Aims: The aim of this study was to compare the surrogate markers used for assessing citrate accumulation or toxicity against direct measurement of plasma citrate as the gold standard.

Methods: Blood was sampled from 20 patients before, during, and after episodes of filtration, with citrate concentration measured by spectrophotometry. Demographic and other clinical and biochemical data were also collected. Per protocol, a 15 mmol/L solution of trisodium citrate was used as the prefilter anticoagulant. Results were analyzed using STATA (v16.0) and presented as mean (SD), median (IQR), or simple proportion. Univariate linear regression, with citrate concentration as the dependent variable, was performed for each surrogate marker.

Results: Twenty patients (17 males) were enrolled in the study, with a mean (SD) age of 62.7 (9.9) years. The uncorrected calcium ratio had the best fit to the citrate data, with an R2 value of 0.39. The albumin-corrected calcium ratio, pH, anion gap (AG), albumin-corrected AG, standard base excess, and strong ion gap all had R2 values less than 0.05.

Conclusion(s): In the absence of direct measurement of citrate concentration, the uncorrected total-to-ionized calcium ratio is superior to other surrogate markers, though not ideal, in assessing citrate accumulation or toxicity.
