Abstract

Apparent increases in sedimentation rates during the past 5 Ma have been inferred at sites around the globe to document increased terrestrial erosion rates, but direct erosion rate records spanning this period are sparse. Modern and paleo-erosion rates for a small alpine catchment (3108 m above sea level) in the Southern Rocky Mountains are measured using the cosmogenic radionuclides (CRNs) ¹⁰Be and ²⁶Al in cave sediment, bedrock on the overlying landscape surface, and coarse bedload in a modern fluvial drainage. The unique setting of the Marble Mountain cave system allows the inherited erosion rates to be interpreted as basin-averaged erosion rates, resulting in the first CRN-based erosion rate record from the Rocky Mountains spanning 5 Myr. Pliocene erosion rates for the landscape above the cave, derived from the oldest cave sample (4.9 ± 0.4 Ma), are 4.9 ± 1.1 m Myr⁻¹. Mid-Pleistocene erosion rates are nearly an order of magnitude higher (33.1 ± 2.7 to 41.3 ± 3.9 m Myr⁻¹), and modern erosion rates are similar; because of snow shielding, these estimates are likely 10–15% higher than actual rates. The most likely explanation for this dramatic increase in erosion rates, which probably occurred shortly before 1.2 Ma, is greater effectiveness of periglacial weathering processes at high elevations under the cooler, wetter Pleistocene climate, providing support for the hypothesis that changes in late Cenozoic climate are responsible for increased continental erosion.
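
For context, the burial age and erosion rates quoted above follow from the standard cosmogenic nuclide relations; the sketch below uses the textbook simple-burial and steady-erosion forms with commonly cited reference values, which are assumptions here and not necessarily the production rates, decay constants, or corrections adopted in the study:

\varepsilon \approx \frac{\Lambda}{\rho}\left(\frac{P_0}{N_{\mathrm{inh}}} - \lambda\right)
\qquad
t_{\mathrm{burial}} \approx \frac{\ln\!\left(R_0 / R_{\mathrm{meas}}\right)}{\lambda_{26} - \lambda_{10}}

Here \varepsilon is the steady-state erosion rate, N_{\mathrm{inh}} the nuclide concentration inherited by the sediment before burial, P_0 the local surface production rate, \Lambda \approx 160 g cm⁻² the attenuation length, \rho \approx 2.7 g cm⁻³ the rock density, R_0 \approx 6.75 the surface ²⁶Al/¹⁰Be production ratio, R_{\mathrm{meas}} the ratio measured in the cave sediment, and \lambda_{10} \approx 5.0 \times 10^{-7} yr⁻¹ and \lambda_{26} \approx 9.8 \times 10^{-7} yr⁻¹ the decay constants. In this framework, burial ages come from the differential decay of ²⁶Al relative to ¹⁰Be once sediment is shielded underground, and paleo-erosion rates come from the inherited concentrations reconstructed at the time of deposition.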
