Abstract

Building on the previous, fifth phase of the Coupled Model Intercomparison Project (CMIP5)-generation Institut Pierre Simon Laplace (IPSL) Earth system model, we designed a new version, IPSL-CM5A2, aimed at running the multi-millennial simulations typical of deep-time paleoclimate studies. Three priorities guided the setup of the model: (1) improving the overall computing performance, (2) overcoming a persistent cold bias present in the previous model generation and (3) making the model able to handle the specific continental configurations of the geological past. These developments include the integration of hybrid Message Passing Interface – Open Multi-Processing (MPI-OpenMP) parallelization in the atmospheric model of the Laboratoire de Météorologie Dynamique (LMDZ), the use of a new library to perform parallel asynchronous input/output by using computing cores as “I/O servers” and the use of a parallel coupling library between the ocean and atmosphere components. The model, which runs at a resolution of 3.75° × 1.875° in the atmosphere and 2° to 0.5° in the ocean, can now simulate ∼100 years per day, opening new possibilities towards the production of multi-millennial simulations with a full Earth system model. The tuning strategy employed to overcome the persistent cold bias is detailed. Comparison of a historical simulation with climatological observations shows overall improvements in the ocean meridional overturning circulation, marine productivity and the latitudinal position of zonal wind patterns. We also present the numerous steps required to run IPSL-CM5A2 for deep-time paleoclimates through a preliminary case study for the Cretaceous. In particular, specific work on the ocean model grid was required to run the model for continental configurations in which continents are relocated according to past paleogeographic reconstructions. By briefly discussing the spin-up of such a simulation, we elaborate on the requirements and challenges awaiting paleoclimate modeling in the coming years, namely finding the best trade-off between the level of description of the processes and the computing cost on supercomputers.
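
For readers less familiar with hybrid parallelization, the sketch below illustrates the general MPI-OpenMP pattern referred to above: the model grid is decomposed into latitude bands distributed across MPI processes, while OpenMP threads share the work inside each process's sub-domain. This is a minimal toy example, not the LMDZ source code; the grid sizes, variable names and 1-D decomposition are assumptions made purely for illustration, and the asynchronous I/O servers and the ocean-atmosphere coupler mentioned in the abstract are not shown.

    /*
     * Illustrative sketch only: a minimal hybrid MPI-OpenMP pattern of the kind
     * used to parallelize atmospheric models. NOT the LMDZ code; all names and
     * sizes are hypothetical.
     */
    #include <mpi.h>
    #include <omp.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define NLAT 96      /* hypothetical number of global latitude points  */
    #define NLON 144     /* hypothetical number of global longitude points */

    int main(int argc, char **argv)
    {
        int provided, rank, nranks;

        /* Request thread support compatible with OpenMP regions between MPI calls. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nranks);

        /* 1-D MPI decomposition: each rank owns a contiguous band of latitudes. */
        int lat0 = rank * NLAT / nranks;
        int lat1 = (rank + 1) * NLAT / nranks;
        int nlat_local = lat1 - lat0;

        double *field = malloc((size_t)nlat_local * NLON * sizeof(double));

        /* OpenMP shares the loop over the local band among the threads of the node. */
        #pragma omp parallel for collapse(2)
        for (int j = 0; j < nlat_local; j++)
            for (int i = 0; i < NLON; i++)
                field[j * NLON + i] = (double)(lat0 + j) + 0.01 * i; /* dummy "physics" */

        /* Global diagnostic: reduce a thread-local sum across all MPI ranks. */
        double local_sum = 0.0, global_sum = 0.0;
        #pragma omp parallel for reduction(+ : local_sum)
        for (int k = 0; k < nlat_local * NLON; k++)
            local_sum += field[k];
        MPI_Reduce(&local_sum, &global_sum, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("ranks=%d threads/rank=%d global sum=%.3e\n",
                   nranks, omp_get_max_threads(), global_sum);

        free(field);
        MPI_Finalize();
        return 0;
    }

The design point of the hybrid approach is that MPI handles communication between compute nodes while OpenMP exploits the shared memory within a node, which reduces the number of MPI sub-domains needed and helps the atmospheric model scale further.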

Highlights

  • Despite the rise of high-performance computing (HPC; all acronyms except model names are defined in Appendix D), the ever-growing complexity and resolution of general circulation models have restricted their use to centennial integrations for future climate projections, or short “snapshot” experiments for paleoclimates, with durations ranging from decades to a few thousand years

  • We present the computing performance of the new model, first on CURIE, compared to Institut Pierre Simon Laplace (IPSL)-CM5A as it was originally run, and then on JOLIOT-CURIE, the Très Grand Centre de Calcul (TGCC) machine that will be used in the coming years by the IPSL climate modeling community and on which we conducted a set of scaling experiments

Introduction

Despite the rise of high-performance computing (HPC; all acronyms except model names are defined in Appendix D), the ever-growing complexity and resolution of general circulation models (and subsequent Earth system models, ESMs) have restricted their use to centennial integrations for future climate projections, or short “snapshot” experiments for paleoclimates, with durations ranging from decades to a few thousand years. Millennial-length simulations are useful to project very long-term future climatic trends and to properly quantify the equilibrium climate sensitivity of models (Rugenstein et al., 2020). In paleoclimate studies, such long simulations are mandatory to either (i) reach a fully equilibrated deep ocean when initialized from idealized thermohaline conditions (a typical procedure in deep-time paleoclimate simulations, i.e., pre-Quaternary, older than 2.6 million years) or (ii) address multi-millennial transient climate evolution such as the glacial–interglacial cycles of the Quaternary period (He, 2011). Several strategies have been set up to overcome the computing cost of long integrations. They involve both the development of dedicated models and specific experimental designs.
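
To make the computing-cost argument concrete, the short program below converts a spin-up length into wall-clock time for a given throughput in simulated years per wall-clock day (SYPD). The ∼100 SYPD figure is the throughput reported for IPSL-CM5A2 in the abstract; the 3000- and 6000-year spin-up lengths and the 10 SYPD comparison point are illustrative assumptions, not values taken from the paper.

    /* Back-of-the-envelope sketch of spin-up wall-clock cost; numbers other
     * than the ~100 SYPD throughput are illustrative assumptions. */
    #include <stdio.h>

    static double walltime_days(double simulated_years, double sypd)
    {
        return simulated_years / sypd;
    }

    int main(void)
    {
        const double spinups[] = {3000.0, 6000.0};   /* assumed deep-time spin-up lengths (years) */
        const double throughputs[] = {10.0, 100.0};  /* assumed slower ESM vs. IPSL-CM5A2-like SYPD */

        for (size_t i = 0; i < sizeof spinups / sizeof spinups[0]; i++)
            for (size_t j = 0; j < sizeof throughputs / sizeof throughputs[0]; j++)
                printf("%5.0f simulated years at %5.1f SYPD -> %6.1f days of wall time\n",
                       spinups[i], throughputs[j],
                       walltime_days(spinups[i], throughputs[j]));
        return 0;
    }

Under these assumed numbers, a 3000-year spin-up shrinks from roughly 300 days of wall time at 10 SYPD to about 30 days at 100 SYPD, which is why throughput is the limiting factor for multi-millennial experiments.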
