Abstract

Paleoclimate records are rich sources of information about the past history of the Earth system. Information theory provides a new means for studying these records. We demonstrate that weighted permutation entropy of water-isotope data from the West Antarctic Ice Sheet (WAIS) Divide ice core reveals meaningful climate signals in this record. We find that this measure correlates with accumulation (meters of ice equivalent per year) and may record the influence of geothermal heating effects in the deepest parts of the core. Dansgaard-Oeschger and Antarctic Isotope Maxima events, however, do not appear to leave strong signatures in the information record, suggesting that these abrupt warming events may actually be predictable features of the climate's dynamics. While the potential power of information theory in paleoclimatology is significant, the associated methods require well-dated and high-resolution data. The WAIS Divide core is the first paleoclimate record that can support this kind of analysis. As more high-resolution records become available, information theory could become a powerful forensic tool in paleoclimate science.

Highlights

  • The Earth contains a vast archive of geochemical information about the past and present states of the climate system

  • We demonstrate that weighted permutation entropy of water-isotope data from the West Antarctic Ice Sheet (WAIS) Divide ice core reveals meaningful climate signals in this record

  • The underlying premise of this paper is that information theory can be useful in understanding the climate signals that are captured in ice-core records

Introduction

The Earth contains a vast archive of geochemical information about the past and present states of the climate system. Ice cores, for example, provide high-resolution proxies for hydrologic cycle variability, greenhouse gases, temperature, and dust distribution, among other things. While a great deal of sophisticated and creative work has been done on these records, very little of that work has leveraged the power of information theory. The Shannon entropy rate, for instance, measures the average rate at which new information—unrelated to anything in the past—is produced by the system that generated a time series. If that rate is very low, the current observation contains a significant amount of information about the past. If the Shannon entropy rate is very high, most of the information in each observation is completely new: i.e., the past tells you little or nothing about the future.

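The measure applied to the isotope record here is weighted permutation entropy (WPE). As an illustration only, and not the authors' exact implementation, the sketch below estimates normalized WPE for a one-dimensional series: each window of m values is reduced to the ordinal pattern that sorts it, pattern counts are weighted by the window variance so that orderings carried by large excursions outweigh those produced by low-amplitude noise, and the Shannon entropy of the resulting distribution is normalized by log2(m!). The embedding dimension m, the delay tau, and the synthetic test series are illustrative choices, not values taken from the paper.

  import numpy as np
  from math import factorial

  def weighted_permutation_entropy(x, m=4, tau=1):
      """Normalized weighted permutation entropy (WPE) of a 1-D series.

      Each window of m values (spaced tau apart) is mapped to the ordinal
      pattern that sorts it; pattern counts are weighted by the window
      variance, so ordering produced by large excursions matters more than
      ordering produced by low-amplitude noise. Output lies in [0, 1].
      """
      x = np.asarray(x, dtype=float)
      n = len(x) - (m - 1) * tau
      if n <= 0:
          raise ValueError("series too short for the chosen m and tau")

      weights = {}
      for i in range(n):
          window = x[i : i + m * tau : tau]
          pattern = tuple(np.argsort(window, kind="stable"))
          weights[pattern] = weights.get(pattern, 0.0) + np.var(window)

      total = sum(weights.values())
      if total == 0.0:  # constant series carries no ordinal information
          return 0.0
      p = np.array(list(weights.values())) / total
      return float(-(p * np.log2(p)).sum() / np.log2(factorial(m)))

  # Toy check: white noise should sit near 1, a smooth sine well below it.
  rng = np.random.default_rng(0)
  print(weighted_permutation_entropy(rng.standard_normal(5000)))
  print(weighted_permutation_entropy(np.sin(np.linspace(0.0, 60.0, 5000))))

Applied in sliding windows along the depth or age axis of a record, an estimator of this kind yields the sort of entropy trace whose correlation with accumulation is described in the abstract.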