The increase in lactate (L) and pyruvate (P) content of arterial blood during experimental and clinical shock states, and the extent to which such increases serve as measures of oxygen deficit and irreversible injury, were investigated on an empirical basis. A standardized method for production of hemorrhagic shock in the Wistar rat was employed. During a 4-hour bleeding period, oxygen consumption of the rat was reduced to approximately 40% of the control value, pH fell from 7.39 to 7.08, and concurrent increases in L from 0.80 to 6.06 mM and in P from 0.07 to 0.18 mM were observed. Cumulative oxygen debt correlated with log L (r = 0.50; P < 0.0005), and both were significantly related to survival. Correlations of cumulative oxygen debt and of survival, both with P and with computed values of the lactate/pyruvate ratio (L/P) and excess lactate (XL), were of no higher magnitude. Partial correlation analysis demonstrated that neither the measurement of P nor the computation of L/P or XL improved predictability. In 142 patients who presented with clinical manifestations of circulatory shock, of whom 62 survived and 80 died, the best empirical discrimination between survivors and those who died was provided by measurement of L, which failed only 11% of the time. This was confirmed by discriminant function analysis, in which the percentage probability of misclassification based on L was 12%, whereas this probability increased to 21% with L/P and 19% with XL. Combining XL and L/P with L failed to improve discrimination. In this series of patients, L served as a sensitive predictor: as L increased from 2.1 to 8.0 mM, the estimated probability of survival decreased from 90% to 10%. These studies corroborate that L alone serves as a reliable indicator; neither the measurement of P nor the computation of L/P or XL was shown to improve either the reliability of L as a measure of cumulative oxygen debt or its value as a prognosticator of survival during shock states.
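The derived quantities compared in the abstract can be made concrete with a short sketch. This is a hedged illustration, not the authors' code: it assumes XL follows Huckabee's standard definition, XL = (L_n − L_0) − (P_n − P_0)(L_0/P_0) with subscript 0 denoting control values, and it approximates the survival curve by a simple linear-in-log-L interpolation between the two anchor points reported above (90% survival at 2.1 mM, 10% at 8.0 mM); the paper's actual discriminant function analysis is not reproduced.

```python
import math

def lp_ratio(lactate_mM: float, pyruvate_mM: float) -> float:
    """Lactate/pyruvate ratio (L/P)."""
    return lactate_mM / pyruvate_mM

def excess_lactate(l0: float, p0: float, ln: float, pn: float) -> float:
    """Excess lactate (XL) in mM, assuming Huckabee's standard definition."""
    return (ln - l0) - (pn - p0) * (l0 / p0)

def survival_probability(lactate_mM: float) -> float:
    """Illustrative only: linear-in-log-L interpolation between the two
    reported anchors (90% at 2.1 mM, 10% at 8.0 mM), clamped to [0, 1].
    The paper's actual probability model is not reproduced here."""
    lo_l, hi_l = math.log(2.1), math.log(8.0)
    frac = (math.log(lactate_mM) - lo_l) / (hi_l - lo_l)
    return max(0.0, min(1.0, 0.90 + frac * (0.10 - 0.90)))

# Control vs. end-of-bleeding values reported for the rat series:
l0, p0 = 0.80, 0.07   # mM, control
ln, pn = 6.06, 0.18   # mM, after 4-hour bleeding period

print(f"L/P control: {lp_ratio(l0, p0):.1f}")            # ~11.4
print(f"L/P shock:   {lp_ratio(ln, pn):.1f}")            # ~33.7
print(f"XL:          {excess_lactate(l0, p0, ln, pn):.2f} mM")  # ~4.00
print(f"P(survival) at L = 4.0 mM: {survival_probability(4.0):.2f}")
```

Note that by this definition XL rises by about 4 mM over the bleeding period even though L itself rises by 5.26 mM, since part of the lactate increase is attributed to the proportional rise in pyruvate; the abstract's point is that this correction adds no predictive value over L alone.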