Abstract

Although the actual timing and flow rates used for crop irrigation are controlled at the scale of individual plots by the irrigator, they are generally not known at higher levels of farm management. This information is nevertheless essential, not only to compute the water balance of irrigated plots and to schedule irrigation, but also to manage water resources at regional scales. The aim of the present study was to detect irrigation timing using time series of surface soil moisture (SSM) derived from Sentinel-1 radar observations. The method consisted of comparing the direction of change of SSM between successive observations with that simulated by a water balance model, using thresholds that had to be calibrated. The performance of the approach was assessed with the F-score, which quantifies the accuracy of irrigation event detection and ranges from 0 (no irrigation timing detected correctly) to 100 (perfect detection). The study focused on five irrigated maize plots and one rainfed maize plot in South-West France, where the approach was tested using in situ measurements and SSM maps derived from Sentinel-1 radar data. The use of in situ data showed that (1) irrigation timing was detected with good accuracy (F-score between 80 and 83 for all plots) and (2) the optimal revisit time between two SSM observations was 2–4 days. The higher uncertainties of microwave SSM products, especially when the crop is well developed (normalized difference vegetation index (NDVI) > 0.7), degraded the score (F-score = 69), but various possibilities for improvement are discussed. This paper opens perspectives for plot-scale irrigation detection over large areas and thus for improved irrigation water management.
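
As a rough illustration of this kind of detection logic and of the F-score used for evaluation, the Python sketch below flags an irrigation event whenever the observed SSM rises between two dates while a rain-only water balance simulation predicts no comparable rise. The function names, threshold values and input arrays (ssm_obs, ssm_model) are illustrative assumptions, not the calibrated algorithm of the study.

    import numpy as np

    def detect_irrigation(ssm_obs, ssm_model, rise_threshold=0.04, model_threshold=0.01):
        """Flag an irrigation event between consecutive dates when the observed
        surface soil moisture (m3/m3) rises while a rain-only water balance model
        predicts little or no rise. Threshold values are illustrative and would
        need calibration, as in the study."""
        events = []
        for t in range(1, len(ssm_obs)):
            d_obs = ssm_obs[t] - ssm_obs[t - 1]      # observed change in SSM
            d_mod = ssm_model[t] - ssm_model[t - 1]  # modelled change without irrigation
            # A rise seen in the observations but not reproduced by the model
            # suggests water was added by irrigation rather than by rainfall.
            events.append(d_obs > rise_threshold and d_mod < model_threshold)
        return np.array(events)

    def f_score(detected, reference):
        """F-score on the 0-100 scale used in the abstract: harmonic mean of the
        precision and recall of the detected irrigation events."""
        tp = np.sum(detected & reference)
        fp = np.sum(detected & ~reference)
        fn = np.sum(~detected & reference)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        return 0.0 if precision + recall == 0 else 100 * 2 * precision * recall / (precision + recall)

In practice, detected events would be matched to the recorded irrigation dates within some tolerance before computing precision and recall; that matching step is omitted here for brevity.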

Highlights

  • Optimal irrigation relies on accurate knowledge of plant water consumption, water flow, and soil moisture dynamics throughout the growing season

  • As revealed by this analysis, several circumstances could lead to incorrect detections: if a second irrigation event occurred within the time interval separating two successive SSM observations, only one event could be accounted for, meaning that the highest achievable accuracy with this dataset was 83% (24/29)

  • Only the farmer knows when irrigation has taken place, yet knowledge of these events is fundamental for computing the water balance of a farm plot


Introduction

Optimal irrigation relies on accurate knowledge of plant water consumption, water flow, and soil moisture dynamics throughout the growing season. Irrigation scheduling is generally based on the farmer's decision-making process, which determines when and how much water should be applied to optimize the development and yield of the irrigated crops [3,4,5]. Several methods can be applied to assist the decision-making process used to establish an irrigation schedule, such as: (1) water balance simulation models, (2) monitoring of soil water content or matric water potential, and (3) a combination of both. The farmer's ability to follow advice, in terms of dates and quantities of water, can be restricted by several factors: an irrigation infrastructure that imposes the timing of irrigation events, an irrigation method that imposes the volumes of water, the time available to implement scheduling changes, etc.
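
As context for option (1), a water balance model for irrigation scheduling typically tracks root-zone soil water as it is refilled by rainfall and irrigation and depleted by crop evapotranspiration. The bucket-type sketch below (in Python) is only a generic illustration with assumed inputs in millimetres; it is not the model used in this study.

    def daily_water_balance(sw0, capacity, rain, irrigation, et_crop):
        """Generic bucket model of root-zone soil water (mm): rainfall and
        irrigation refill the bucket, crop evapotranspiration empties it, and
        any excess above the storage capacity is assumed to drain away."""
        sw = sw0
        series = []
        for p, i, et in zip(rain, irrigation, et_crop):
            sw = sw + p + i - et                 # daily update of soil water
            sw = min(max(sw, 0.0), capacity)     # bound between 0 and capacity
            series.append(sw)
        return series

    # Example: a dry week with a single 30 mm irrigation on day 4
    print(daily_water_balance(60.0, 120.0,
                              rain=[0, 0, 2, 0, 0, 0, 0],
                              irrigation=[0, 0, 0, 30, 0, 0, 0],
                              et_crop=[5, 6, 6, 7, 6, 5, 5]))

In such a scheme, irrigation would typically be advised once the simulated soil water falls below a crop-specific threshold.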
