Abstract

While several processes have been identified to explain the decrease in atmospheric CO<sub>2</sub> during glaciations, a better quantification of the contribution of each of these processes is needed. For example, enhanced aeolian iron input into the ocean during glacial times has been suggested to drive a 5 to 28 ppm atmospheric CO<sub>2</sub> decrease. Here, we constrain this contribution by performing a set of sensitivity experiments with different aeolian iron input patterns and iron solubility factors under boundary conditions corresponding to 70 thousand years before present (70 ka BP), a time period characterised by the first observed peak in glacial dust flux. We show that the decrease in CO<sub>2</sub> as a function of the Southern Ocean iron input follows an exponential decay relationship. This exponential decay response arises due to the saturation of the biological pump efficiency and levels out at ~21 ppm in our simulations. Using a best estimate of surface water iron solubility between 3 and 5 %, a ~9 to 11 ppm CO<sub>2</sub> decrease is simulated at 70 ka BP, while a plausible range of CO<sub>2</sub> drawdown between 4 and 16 ppm is obtained using the wider but possible solubility range of 1 to 10 %. This would account for ~12–50 % of the reconstructed decrease in atmospheric CO<sub>2</sub> (~32 ppm) between 71 and 64 ka BP. We further find that in our simulations the decrease in atmospheric CO<sub>2</sub> concentrations is solely driven by iron fluxes south of the Antarctic polar front, while iron fertilization elsewhere plays a negligible role.
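The saturating response described above can be illustrated with a minimal sketch. The asymptote of ~21 ppm comes from the abstract; the functional form A·(1 − e<sup>−F/F₀</sup>) is a standard way to express an exponential-decay saturation curve, and the e-folding scale F₀ is a purely hypothetical illustrative value, not a result from the paper.

```python
import math

# Asymptotic CO2 drawdown (ppm): the ~21 ppm at which the response
# "levels out" in the simulations, as stated in the abstract.
A_PPM = 21.0
# Hypothetical e-folding scale of the Southern Ocean iron input,
# in arbitrary relative units (illustrative assumption only).
F0 = 1.0

def co2_drawdown(iron_input: float) -> float:
    """Saturating exponential response of CO2 drawdown (ppm)
    to Southern Ocean iron input."""
    return A_PPM * (1.0 - math.exp(-iron_input / F0))

# Diminishing returns: doubling the iron input adds progressively
# less drawdown as the biological pump efficiency saturates.
for f in (0.5, 1.0, 2.0, 5.0):
    print(f"iron input {f:>3}: drawdown {co2_drawdown(f):5.1f} ppm")
```

With this form, the marginal CO<sub>2</sub> decrease per unit of added iron shrinks as the input grows, which is one simple way to capture the saturation of the biological pump efficiency that the abstract describes.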
