Water is a key input to agriculture, which in turn is critical to global food security. Mathematical models are one strategy for managing agricultural water use, and they can effectively predict the effect of irrigation management on crop yields once their accuracy has been demonstrated. In this study, the CROPWAT and SALTMED models were tested against the water quantities applied under surface and drip irrigation (SI and DI) systems to estimate irrigation scheduling and wheat yield. For this purpose, field experiments were conducted over two consecutive years to study the effects of irrigation water levels of 80%, 100%, and 120% of crop evapotranspiration (I80, I100, and I120) on the yield and water productivity (WP) of wheat under SI and DI. Irrigation treatments affected yield components such as plant height, number of spikes, spike length, and 1000-kernel weight, although the differences were not statistically significant in some cases. Biological yield in the I80 treatment was 12.8% and 8.5% lower than in the I100 and I120 treatments, respectively. The I100 treatment under DI produced the highest grain yield, while the I80 treatment under DI showed the largest grain yield decrease (22.78%). The SI system consumed more water than the DI system, which was reflected in the WP: in both systems, WP was significantly lower (p < 0.05) in the I120 treatment than in the I80 and I100 treatments. The CROPWAT model was used to evaluate irrigation scheduling and estimate the wheat yield response. The model showed that, under SI, where water stress coefficient (Ks) values remained below one, increasing the irrigation water level increased deep percolation (DP); accordingly, the I120 treatment had the highest DP (556.15 mm on average), followed by the I100 and I80 treatments. Under DI, the I100 and I120 treatments maintained Ks values equal to one throughout the growing seasons, whereas the I80 treatment had Ks values below one during wheat's mid- and late-season stages. Compared with the I120 treatment (97.05 mm on average), the I100 and I80 treatments under DI reduced DP by 93.4% and 74.3%, respectively. The I120 treatment had the lowest irrigation schedule efficiency in both irrigation systems, followed by the I100 and I80 treatments. In both seasons, irrigation schedule deficiencies were highest in the I80 treatment under DI (12.35% on average). The I80 treatment under DI also had a significant yield reduction (21.9% on average) in both seasons, while the irrigation level treatments under SI showed nearly equal reductions. The SALTMED model is an integrated model that accounts for irrigation systems, soil types, crops, and water application strategies to simulate soil water content (SWC) and crop yield. It was calibrated and validated against the experimental data across irrigation levels and systems. Model accuracy was assessed using the coefficient of correlation (R), root mean square error (RMSE), mean absolute error (MAE), and mean absolute relative error (MARE). When simulating SWC, the model's average R values were 0.89 and 0.84, RMSE values were 0.018 and 0.019, MAE values were 0.015 and 0.016, and MARE values were 8.917% and 9.133% during the calibration and validation periods, respectively.
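For readers reproducing a similar evaluation, the sketch below shows how these four agreement statistics (R, RMSE, MAE, MARE) are conventionally computed from paired observed and simulated values. It is a minimal illustration under that assumption, not code from the study, and the soil water content values are hypothetical.

```python
# Minimal sketch of the agreement statistics used to assess simulated vs.
# observed soil water content (SWC). Not taken from the paper; the data
# below are hypothetical volumetric SWC values (m3 m-3).
import numpy as np

def evaluate(observed, simulated):
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residuals = simulated - observed
    r = np.corrcoef(observed, simulated)[0, 1]          # Pearson correlation (R)
    rmse = np.sqrt(np.mean(residuals ** 2))             # root mean square error
    mae = np.mean(np.abs(residuals))                    # mean absolute error
    mare = np.mean(np.abs(residuals) / observed) * 100  # mean absolute relative error, %
    return {"R": r, "RMSE": rmse, "MAE": mae, "MARE": mare}

obs = [0.21, 0.24, 0.19, 0.26, 0.22]
sim = [0.20, 0.25, 0.18, 0.27, 0.23]
print(evaluate(obs, sim))
```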
When simulating crop yield, the relative error (RE) of the SALTMED model ranged from −0.11% to 24.37% for biological yield and from 0.1% to 19.18% for grain yield during the calibration period, while in the validation period RE ranged from 3.8% to 29.81% and from 2.02% to 25.41%, respectively. The SALTMED model performed well in simulating wheat yield at different irrigation water levels under SI or DI.
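As a minimal sketch of this yield comparison, relative error is conventionally defined as RE = (simulated − observed) / observed × 100, which is consistent with the one negative value reported above indicating under-prediction of biological yield. The function and yield figures below are illustrative assumptions, not the study's data.

```python
# Hedged sketch of the conventional relative-error formula for simulated vs.
# observed yield; a negative RE indicates under-prediction. The yield values
# below are hypothetical (t/ha) and are not taken from the study.
def relative_error(observed: float, simulated: float) -> float:
    """Return RE in percent: (simulated - observed) / observed * 100."""
    return (simulated - observed) / observed * 100.0

print(relative_error(7.5, 7.2))  # under-prediction -> -4.0
print(relative_error(7.5, 8.1))  # over-prediction  ->  8.0
```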