Abstract

While several processes have been identified to explain the decrease in atmospheric CO2 during glaciations, a better quantification of the contribution of each of these processes is needed. For example, enhanced aeolian iron input into the ocean during glacial times has been suggested to drive a 5 to 28 ppm decrease in atmospheric CO2. Here, we constrain this contribution by performing a set of sensitivity experiments with different aeolian iron input patterns and iron solubility factors under boundary conditions corresponding to 70 thousand years before present (70 ka BP), a time period characterised by the first observed peak in glacial dust flux. We show that the decrease in CO2 as a function of the Southern Ocean iron input follows an exponential decay relationship. This response arises from the saturation of the biological pump efficiency and levels out at ∼21 ppm in our simulations. Using a best estimate of surface water iron solubility between 3 and 5 %, a ∼9 to 11 ppm CO2 decrease is simulated at 70 ka BP, while a plausible CO2 drawdown of 4 to 16 ppm is obtained using the wider but still possible solubility range of 1 to 10 %. This would account for ∼12–50 % of the reconstructed decrease in atmospheric CO2 (∼32 ppm) between 71 and 64 ka BP. We further find that in our simulations the decrease in atmospheric CO2 concentration is driven solely by iron fluxes south of the Antarctic polar front, while iron fertilisation elsewhere plays a negligible role.
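As an illustrative sketch of the exponential decay relationship described above (the symbols F, F_0, and the exact functional form are our shorthand for this summary, not notation taken from the study), the simulated drawdown saturating at ∼21 ppm can be expressed as

\[
\Delta \mathrm{CO_2}(F) \;\approx\; \Delta \mathrm{CO_2^{max}} \left( 1 - e^{-F/F_0} \right), \qquad \Delta \mathrm{CO_2^{max}} \approx 21\ \mathrm{ppm},
\]

where F is the bioavailable (soluble) aeolian iron flux to the Southern Ocean and F_0 is an assumed e-folding constant; the plateau reflects the biological pump efficiency approaching its maximum.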
