Abstract
Rill erosion can be a large portion of the total erosion in disturbed forests, but measurements of runoff and erosion at the rill scale are uncommon. Simulated rill erosion experiments were conducted in two forested areas in the northwestern United States on slopes ranging from 18 to 79%. We compared runoff rates, runoff velocities, and sediment flux rates from natural (undisturbed) forests and from forests either burned at low soil burn severity (10 months or 2 weeks post‐fire), burned at high soil burn severity, or subject to skidding of felled logs. The runoff rates and velocities in the natural sites (2.7 L min−1 and 0.016 m s−1, respectively) were lower than those in all the disturbed sites (12 to 21 L min−1 and 0.19 to 0.31 m s−1, respectively), except for the 10‐month‐old low soil burn severity site, where the velocity (0.073 m s−1) was indistinguishable from that of the natural sites. The mean sediment flux rate in the natural sites was also very small (1.3 × 10−5 kg s−1) compared to the rates in the disturbed areas (2.5 × 10−4 to 0.011 kg s−1). The hillslope gradient did not affect the runoff or sediment responses. The sediment flux rates generally were greater in the initial stage of each inflow period than in the steady state condition, but there was no corresponding transient effect in runoff rates. Rill erosion modeling implications based on these data are presented in part 2 of this study.