Abstract. While several processes have been identified to explain the decrease in atmospheric CO2 during glaciations, a better quantification of the contribution of each of these processes is needed. For example, enhanced aeolian iron input into the ocean during glacial times has been suggested to drive a 5 to 28 ppm atmospheric CO2 decrease. Here, we constrain this contribution by performing a set of sensitivity experiments with different aeolian iron input patterns and iron solubility factors under boundary conditions corresponding to 70 000 years before present (70 ka), a time period characterised by the first observed peak in glacial dust flux. We show that the decrease in CO2 as a function of Southern Ocean iron input follows an exponential decay relationship. This exponential decay response arises from the saturation of the biological pump efficiency and levels out at ∼21 ppm in our simulations. We show that the changes in atmospheric CO2 are more sensitive to the solubility of iron in the ocean than to the regional distribution of the iron fluxes. If surface water iron solubility is considered constant through time, we find a CO2 drawdown of ∼4 to ∼8 ppm. However, there is evidence that iron solubility was higher during glacial times. A best estimate of solubility changing from 1 % during interglacials to 3 % to 5 % under glacial conditions yields a ∼9 to 11 ppm CO2 decrease at 70 ka, while a plausible range of CO2 drawdown of 4 to 16 ppm is obtained using the wider but still plausible solubility range of 1 % to 10 %. This would account for ∼12 %–50 % of the reconstructed decrease in atmospheric CO2 (∼32 ppm) between 71 and 64 ka. We further find that in our simulations the decrease in atmospheric CO2 concentration is solely driven by iron fluxes south of the Antarctic polar front, while iron fertilisation elsewhere plays a negligible role.
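The abstract describes the CO2 drawdown as an exponential decay function of Southern Ocean iron input that saturates near 21 ppm. A minimal sketch of one functional form consistent with that description is given below; the specific form and the symbols F (Southern Ocean iron input), F_0 (e-folding scale), and ΔCO2^max are our notation for illustration and are not taken from the paper.

% Hypothetical saturating form consistent with the abstract's description;
% the ~21 ppm plateau is the simulated upper limit, F_0 is an assumed scale.
\Delta \mathrm{CO_2}(F) = \Delta \mathrm{CO_2^{max}} \left( 1 - e^{-F/F_0} \right), \qquad \Delta \mathrm{CO_2^{max}} \approx 21\ \mathrm{ppm}

Under such a form, the reported ∼12 %–50 % contribution follows directly from the simulated 4 to 16 ppm drawdown relative to the reconstructed ∼32 ppm decrease (4/32 ≈ 12.5 %, 16/32 = 50 %).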