Abstract

Up until now, erosivity required for soil loss predictions has been mainly estimated from rain gauge data at point scale and then spatially interpolated to erosivity maps. Contiguous rain data from weather radar measurements, satellites, cellular communication networks and other sources are now available, but they differ from point-scale data in measurement method and in temporal and spatial scale. We determined how the intensity threshold of erosive rains has to be modified and which scaling factors have to be applied to account for these differences in method and scale. Furthermore, a positional effect quantifies the heterogeneity of erosivity within 1 km², which is presently the highest resolution of freely available gauge-adjusted radar rain data. These effects were analysed using several large data sets with a total of approximately 2×10⁶ erosive events (e.g. records of 115 rain gauges for 16 years distributed across Germany and radar rain data for the same locations and events). With decreasing temporal resolution, peak intensities decreased and the intensity threshold was met less often. This became especially pronounced when time increments became larger than 30 min. With decreasing spatial resolution, intensity peaks were also reduced because large areas without erosive rain were additionally included within one pixel. This was due to the steep spatial gradients in erosivity. Erosivity of single events could be zero or more than twice the mean annual sum within a distance of less than 1 km. We conclude that the resulting large positional effect requires the use of contiguous rain data, even over distances of less than 1 km, but at the same time contiguously measured radar data cannot be resolved to point scale. The temporal scale is easier to consider, but with time increments larger than 30 min the loss of information increases considerably.
We provide functions to account for temporal scale (from 1 to 120 min) and spatial scale (from rain gauge to pixels of 18 km width) that can be applied to rain gauge data of low temporal resolution and to contiguous rain data.

Highlights

  • Prediction of rain-induced soil erosion using models like the Universal Soil Loss Equation (USLE) requires quantification of the potential of rain to cause soil detachment and transport

  • With 17 rain gauges operating at 1 min resolution, 4599 erosive events were determined in 16 years

  • The decrease was less steep below a temporal resolution of 30 min than above: min(I_max30) = −0.59 τ^0.5 + 13.23 for τ ≤ 30 min (Eq. 6a), and min(I_max30) = 147 τ^−0.79 for τ > 30 min (Eq. 6b)
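The piecewise threshold in the highlight above can be sketched as a small function. This is a minimal illustration only; the function name and the units assumed here (mm h⁻¹ for the 30 min peak intensity, min for the time increment τ) are assumptions for the sketch, not taken from the paper:

```python
def min_imax30(tau):
    """Lowest 30 min peak intensity (assumed mm/h) at which an event
    still meets the erosivity threshold, for temporal resolution tau
    (min). Piecewise fit following Eqs. (6a) and (6b)."""
    if tau <= 30:
        return -0.59 * tau ** 0.5 + 13.23  # Eq. (6a): gentle decrease
    return 147.0 * tau ** -0.79            # Eq. (6b): steeper decrease
```

Note that the two branches nearly coincide at τ = 30 min (both give roughly 10 mm/h), so the fitted threshold is effectively continuous across the break point.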


Introduction

Prediction of rain-induced soil erosion using models like the Universal Soil Loss Equation (USLE) requires quantification of the potential of rain to cause soil detachment and transport. This potential is called rainfall erosivity and is typically obtained from point rainfall measurements using rain gauges. The characteristic relation between erosivity and rain depth of the same period was termed erosivity density and used in RUSLE2 (Dabney et al., 2012; USDA, 2013). It is recommended for areas with poor data availability (Nearing et al., 2017). Several countries provide rain-gauge-adjusted radar data products with spatial resolutions of, for example, 1 km × 1 km.
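The erosivity density mentioned above, i.e. erosivity related to the rain depth of the same period, can be illustrated with a minimal sketch. The helper function and the units noted in it are hypothetical and only mirror the definition given in the text, not an implementation from RUSLE2:

```python
def erosivity_density(erosivity_sum, rain_depth_sum):
    """Erosivity density: total erosivity of a period divided by the
    rain depth of the same period (units hypothetical, e.g. N h^-1
    per mm of rain). Illustrates the relation used in RUSLE2."""
    if rain_depth_sum <= 0:
        raise ValueError("rain depth must be positive")
    return erosivity_sum / rain_depth_sum
```

Because the ratio is characteristic for a region, it allows erosivity to be estimated from rain depth alone where high-resolution intensity records are unavailable, which is why it is recommended for areas with poor data availability.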

