Abstract

Background

Hemorrhage remains the leading cause of death following traumatic injury in both civilian and military settings. Heart rate variability (HRV) and heart rate complexity (HRC) have been recognized as potential “new vital signs” for monitoring trauma patients, although their benefit for triage decision support remains unclear. Another new paradigm, the compensatory reserve measurement (CRM), integrates all mechanisms responsible for compensation during blood loss and is designed to identify current physiologic status by estimating the progression toward hemodynamic decompensation. We hypothesized that CRM would provide greater sensitivity and specificity for detecting progressive reductions in central circulating blood volume and the onset of decompensation than measurements of HRV and HRC.

Methods

Continuous, noninvasive measurements of CRM and the electrocardiogram (ECG) were made in 101 healthy volunteers (59 males, 42 females; mean ± SD age 28 ± 8 years) during stepwise, progressive central hypovolemia to the point of decompensation. Lower body negative pressure (LBNP) was used to simulate hemorrhage by inducing the onset and progression of central hypovolemia. Time-domain measures of HRV were derived from the ECG signal, including the standard deviation of R-to-R intervals (RRI) in milliseconds (ms), the root mean square of successive differences, the percentage of RRIs that vary by at least 50 ms, and amplitude measures of High Frequency Oscillations and Low Frequency Oscillations (CDMLF) obtained from the complex demodulation method. Measures of HRC included Sample Entropy, Fractal Dimension by Dispersion Analysis, Stationarity, and Detrended Fluctuation Analysis. Data were analyzed using generalized estimating equations with a compound symmetry covariance structure for longitudinal analysis of correlated continuous variables; results are reported as least-squares means with 95% confidence intervals at each LBNP level.

Results

CRM demonstrated a superior ROC AUC (0.93) compared with all measures of HRV (≤ 0.84) and HRC (≤ 0.86). Sensitivity and specificity at the ROC-optimal thresholds were greater for CRM (SENS = 0.84; SPEC = 0.84) than for HRV (SENS ≤ 0.78; SPEC ≤ 0.77) and HRC (SENS ≤ 0.79; SPEC ≤ 0.77). With standardized values across all levels of LBNP, CRM showed a steeper decline, less variability, and explained a greater proportion of the variation in the data than either HRV or HRC during progressive hypovolemia.

Conclusion

Consistent with our hypothesis, CRM had greater sensitivity and specificity for detecting progressive reductions in circulating blood volume and decompensation than HRV and HRC. These findings add to the growing body of literature describing the advantages of CRM for detecting reductions in central blood volume. Most importantly, these results provide further support for the potential use of CRM in the triage and monitoring of patients at risk of circulatory shock following blood loss.

Support or Funding Information

Study funding was provided by a grant from the US Army Combat Casualty Care Research Program (D‐009‐2014‐USAISR).

This abstract is from the Experimental Biology 2019 Meeting. There is no full text article associated with this abstract published in The FASEB Journal.
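
The time-domain HRV measures and Sample Entropy named in the Methods are standard computations on the R-to-R interval series. The sketch below is a minimal, illustrative implementation, not the study's analysis code; the function names, the tolerance r = 0.2 × SD, and the embedding dimension m = 2 are common defaults assumed here.

```python
import numpy as np

def time_domain_hrv(rri_ms):
    """Time-domain HRV statistics from a series of R-to-R intervals (ms)."""
    rri = np.asarray(rri_ms, dtype=float)
    diffs = np.diff(rri)
    return {
        "SDNN": rri.std(ddof=1),                       # SD of all R-to-R intervals (ms)
        "RMSSD": np.sqrt(np.mean(diffs ** 2)),         # root mean square of successive differences
        "pNN50": 100.0 * np.mean(np.abs(diffs) > 50),  # % of successive differences exceeding 50 ms
    }

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy (SampEn) with tolerance r = r_factor * SD and Chebyshev distance."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std(ddof=1)
    n = len(x)

    def pair_count(dim):
        # Count template pairs of length `dim` whose maximum element-wise
        # difference is within the tolerance r (self-matches excluded).
        templates = np.array([x[i:i + dim] for i in range(n - dim)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b = pair_count(m)       # matching template pairs of length m
    a = pair_count(m + 1)   # matching template pairs of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else float("nan")

# Example with a hypothetical RRI file (one interval in ms per line):
# rri = np.loadtxt("rri_ms.txt"); print(time_domain_hrv(rri), sample_entropy(rri))
```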

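The longitudinal model described in the Methods (generalized estimating equations with a compound-symmetry working covariance) can be sketched with statsmodels, where the Exchangeable correlation structure plays the role of compound symmetry. The data file and column names (crm, lbnp_level, subject_id) are assumptions for illustration, not the study's actual variables.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per subject per LBNP level.
# Column names ("crm", "lbnp_level", "subject_id") are assumed for illustration.
df = pd.read_csv("lbnp_measurements.csv")

# GEE with an exchangeable working correlation (the GEE analogue of compound
# symmetry), modeling the continuous outcome as a function of LBNP level while
# accounting for repeated measures within each subject.
model = smf.gee(
    "crm ~ C(lbnp_level)",
    groups="subject_id",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
)
result = model.fit()
print(result.summary())

# Model-based means with 95% confidence intervals at each LBNP level (the
# least-squares means reported in the abstract) follow from the fitted
# coefficients and their covariance matrix; the same formula can be refit
# with an HRV or HRC metric as the outcome for comparison.
```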
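
The ROC comparison in the Results (AUC plus sensitivity and specificity at an optimal threshold) can be reproduced in outline with scikit-learn, here using Youden's J to select the threshold; the labels and scores are placeholders, since the per-level classification data are not given in the abstract. A metric that decreases with blood loss, such as CRM, would be negated first so that higher scores indicate the positive class.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def roc_summary(scores, labels):
    """AUC and sensitivity/specificity at the Youden-optimal threshold.

    `labels`: 1 where the criterion event (e.g., hemodynamic decompensation)
    is present, 0 otherwise.
    `scores`: metric values oriented so larger values mean "more positive".
    """
    fpr, tpr, thresholds = roc_curve(labels, scores)
    j = tpr - fpr                       # Youden's J statistic at each threshold
    best = int(np.argmax(j))
    return {
        "AUC": roc_auc_score(labels, scores),
        "sensitivity": tpr[best],
        "specificity": 1.0 - fpr[best],
        "threshold": thresholds[best],
    }

# Example with hypothetical arrays (CRM negated so lower reserve scores higher):
# print(roc_summary(-crm_values, decompensation_labels))
```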