Abstract

The extent to which heat stress compromises blood pressure control varies considerably between individuals: some are intolerant to a simulated hemorrhagic challenge while heat stressed, whereas others are relatively tolerant. The objective of this study was to determine whether the magnitude of the reduction in central venous pressure (CVP) during heat stress explains this variability in tolerance to a hemorrhagic challenge. Simulated hemorrhage was imposed via lower body negative pressure to pre‐syncope in 20 subjects during passive heating. Tolerance was quantified with a cumulative stress index (CSI), and the 8 subjects with the lowest and the 8 with the highest CSI were categorized as low‐ and high‐tolerant, respectively. The increase in core body temperature was similar between groups (1.45 ± 0.11 versus 1.41 ± 0.09°C, P = 0.44), and, by design, CSI was greater for the high‐tolerant group (439 ± 183 versus 86 ± 39 CSI units; P < 0.001). Although heat stress decreased CVP in both the low‐tolerant (6.6 ± 2.6 to 0.8 ± 2.1 mmHg) and the high‐tolerant (7.6 ± 2.0 to 2.0 ± 3.0 mmHg) groups (P < 0.001), the magnitude of this reduction did not differ between groups (low: 5.9 ± 1.4; high: 5.6 ± 2.1 mmHg; P = 0.53). Contrary to our hypothesis, differences in blood pressure control during simulated hemorrhage are not related to differences in the magnitude of the heat stress‐induced reduction in CVP.

Supported by NIH Grants HL61388 & HL84072 & TREi
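The CSI-based grouping described above can be sketched in a few lines of Python. Note the assumptions: the abstract does not define CSI, so this sketch uses a common formulation (the sum of each tolerated LBNP level multiplied by the time spent at that level, up to pre‐syncope); the function and variable names are illustrative, not from the study.

```python
def cumulative_stress_index(stages):
    """Hypothetical CSI: sum of (LBNP level in mmHg x minutes tolerated at
    that level) across all completed stages up to pre-syncope. This is an
    assumed definition; the abstract does not specify the formula."""
    return sum(mmhg * minutes for mmhg, minutes in stages)

def split_by_tolerance(csi_by_subject, n=8):
    """Rank subjects by CSI; the n lowest are categorized as low-tolerant
    and the n highest as high-tolerant, as in the study's 8/8 split."""
    ranked = sorted(csi_by_subject, key=csi_by_subject.get)
    return ranked[:n], ranked[-n:]
```

For example, a subject completing 3 min each at 20, 30, and 40 mmHg of LBNP would accumulate 20·3 + 30·3 + 40·3 = 270 CSI units under this assumed definition.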

