Abstract

Correctly calculating the timing and amount of crop irrigation is crucial for capturing irrigation effects on surface water and energy budgets and on land-atmosphere interactions. This study incorporated a dynamic irrigation scheme into the Noah with multiparameterization (Noah-MP) land surface model and investigated three methods of determining crop growing-season length from agricultural management data. The irrigation scheme was assessed at field scales using observations from two contrasting (irrigated and rainfed) AmeriFlux sites near Mead, Nebraska. Results show that crop-specific growing-season length helped capture the timing of the first application and the total irrigation amount, especially for soybeans. With a calibrated soil-moisture triggering threshold (IRR_CRI), using planting and harvesting dates alone could reasonably predict the first application for maize. For soybeans, additional constraints on the growing season were required to correct an early bias in the first modeled application. Realistic leaf area index (LAI) input was essential for identifying the LAI-based growing season. When transitioning from field to regional scales, the county-level calibrated IRR_CRI helped mitigate the overestimated (underestimated) total irrigation amount in southeastern Nebraska (the lower Mississippi River Basin). In these two heavily irrigated regions, irrigation produced a cooling effect of 0.8–1.4 K, a moistening effect of 1.2–2.4 g/kg, a reduction in sensible heat flux of 60–105 W/m², and an increase in latent heat flux of 75–120 W/m². Most of the irrigation water was used to increase soil moisture and evaporation rather than runoff. The lack of regional-scale irrigation timing and crop-specific parameters makes it difficult to transfer the evaluation and parameter-constraint methods from field to regional scales.
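
The soil-moisture-triggered irrigation rule referenced above can be illustrated with a short sketch. The snippet below is a minimal illustration, assuming the commonly used formulation in which irrigation is triggered when root-zone soil moisture availability drops below the threshold IRR_CRI during the crop growing season and the root zone is then refilled toward field capacity; the function and variable names (e.g., soil_moisture_availability, irrigation_demand) and the parameter values are illustrative, not the paper's actual implementation.

```python
# Minimal sketch of a soil-moisture-threshold irrigation trigger.
# Assumes the common formulation: moisture availability (MA) is the
# root-zone soil moisture scaled between wilting point and field
# capacity; irrigation is applied when MA < IRR_CRI inside the
# growing season, refilling the root zone toward field capacity.
# Names and values are illustrative, not taken from the paper.

def soil_moisture_availability(smc, smc_wilt, smc_ref):
    """Fraction of plant-available water in the root zone (0-1)."""
    return (smc - smc_wilt) / (smc_ref - smc_wilt)

def irrigation_demand(smc, smc_wilt, smc_ref, root_depth_mm,
                      irr_cri, in_growing_season):
    """Return irrigation water (mm) needed to refill the root zone,
    or 0.0 if the trigger condition is not met."""
    if not in_growing_season:
        return 0.0
    ma = soil_moisture_availability(smc, smc_wilt, smc_ref)
    if ma >= irr_cri:
        return 0.0
    # Deficit between current moisture and field capacity, converted
    # from a volumetric fraction to a water depth over the root zone.
    return (smc_ref - smc) * root_depth_mm

# Example: loamy soil, 1 m root zone, calibrated threshold IRR_CRI = 0.5
demand = irrigation_demand(smc=0.18, smc_wilt=0.10, smc_ref=0.32,
                           root_depth_mm=1000.0, irr_cri=0.5,
                           in_growing_season=True)
print(f"Irrigation applied: {demand:.1f} mm")
```

In this sketch the growing-season flag stands in for the three growing-season definitions compared in the study (planting/harvesting dates, crop-specific season length, and an LAI-based season), and IRR_CRI is the calibrated triggering threshold.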
