Abstract

Irrigation scheduling is often based on the analogy of a 'tipping bucket' and on measuring or predicting the amount of water stored within the bucket. We compared this conventional approach with one that stops irrigation when the bucket tips, i.e., when infiltrating water moves from an upper to a lower soil layer. Electronic wetting front detectors were used to close a solenoid valve at the time infiltrating water reached a depth of 300 mm, when irrigating a lucerne crop in a rain-out shelter. Four different ways of using information on the position of the wetting front were compared with scheduling irrigation from soil water measurements made by a neutron probe or calculated by a soil-crop model. Automatically closing the solenoid valve at the time the upper bucket tipped was a successful approach, but only when the correct irrigation interval was selected. If the irrigation interval was too short, water still draining from the soil layer above the detector resulted in deep drainage. Scheduling from wetting front detectors placed at 600 mm depth was unsuccessful because of the difficulty of detecting weak wetting fronts at this depth. The commonly accepted method of measuring a soil water deficit and refilling the bucket to field capacity was not without limitation: the soil drained for many days after irrigation, well beyond the 48 h period typically taken to represent the upper drained limit, so drainage and evapotranspiration occurred concurrently.
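
The detector-triggered shut-off described above (open the valve to irrigate, close it when the detector at 300 mm signals that the wetting front has arrived) is simple control logic. The paper provides no code, so the following Python sketch is illustrative only: the `WettingFrontDetector` and `SolenoidValve` classes, the polling interval, and the safety timeout are hypothetical stand-ins for real hardware interfaces, not anything specified in the study.

```python
# Minimal sketch of detector-triggered irrigation shut-off.
# All class names, method names, and timing parameters are hypothetical
# illustrations; the paper describes the behaviour, not an API.
import time

# Simulated arrival time of the wetting front (demo only; real hardware
# would report when free water collects at the detector depth).
_SIMULATED_ARRIVAL = time.monotonic() + 3.0


class WettingFrontDetector:
    """Stand-in for an electronic wetting front detector at a fixed depth."""

    def __init__(self, depth_mm: int):
        self.depth_mm = depth_mm

    def front_arrived(self) -> bool:
        # Replace with a real sensor read in an actual installation.
        return time.monotonic() >= _SIMULATED_ARRIVAL


class SolenoidValve:
    """Stand-in for a solenoid valve on the irrigation supply line."""

    def __init__(self):
        self.is_open = False

    def open_valve(self):
        self.is_open = True
        print("valve opened: irrigation started")

    def close_valve(self):
        self.is_open = False
        print("valve closed")


def irrigate_until_front(detector, valve, poll_s=1.0, timeout_s=3600.0):
    """Open the valve, then close it when the wetting front reaches the
    detector depth (the 'bucket tips') or a safety timeout elapses."""
    valve.open_valve()
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        if detector.front_arrived():
            valve.close_valve()
            return True
        time.sleep(poll_s)
    valve.close_valve()  # fail-safe: never leave the valve open
    return False


if __name__ == "__main__":
    irrigate_until_front(WettingFrontDetector(depth_mm=300), SolenoidValve())
```

One design point worth noting: the safety timeout matters in practice, since the abstract reports that weak wetting fronts at 600 mm could go undetected, and an undetected front with no timeout would leave the valve open indefinitely.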
