Subsurface drainage is an effective measure for mitigating soil salinization, but few studies have investigated subsurface drainage design criteria at the regional scale, or the methods and factors that determine them. In this study, a multi-scale numerical model was established using the SDR, MODFLOW-LGR and MT3DMS packages to simulate regional groundwater dynamics and subsurface drainage. After the model was calibrated and validated against field data, a total of 1288 scenarios were simulated to evaluate the effects of the subsurface drainage simulation method and grid size, and to examine the influence of irrigation quantity, initial groundwater table depth (GWDini) and soil texture on the subsurface drainage design criteria. The results show that simulating subsurface drainage with the DRN package led to substantial errors, with relative root mean square errors exceeding 120 %. The subsurface drainage amount simulated by the SDR package generally decreased with increasing grid size, because the groundwater level at the midpoint between drains is averaged over a larger adjacent region. To balance simulation efficiency and accuracy, a 1 m grid size is appropriate for the SDR package in subsurface drainage simulation, with relative errors smaller than 4.8 %. The design criteria depend most strongly on the hydraulic conductivity of the first aquifer layer (Ka), followed by the irrigation quantity during the winter irrigation period (IWP), GWDini and the irrigation quantity during the growth period (IGP). On average, for every 10 % reduction in IWP and IGP (i.e., 64 and 30.2 mm), the minimum pipe depth (MPD) increases by approximately 0.08 m and 0.03 m, respectively. For every 1 m reduction in GWDini, the MPD increases by approximately 0.13 m. When Ka is less than 0.5 m/d, MPD increases markedly as Ka decreases, especially at larger pipe spacings. This study provides a reference for accurate prediction of regional groundwater and salt dynamics and subsurface drainage, and for the determination of subsurface drainage design criteria.
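As a rough illustration of the average sensitivities reported above, the following minimal Python sketch linearizes them to estimate how MPD shifts relative to a reference scenario. It is not the authors' model; the additivity of the effects, the helper name estimate_mpd_shift and the example inputs are assumptions for illustration only.

    # Illustrative sketch only: linearizes the average sensitivities reported in the
    # abstract (0.08 m MPD per 10 % reduction in IWP, 0.03 m per 10 % reduction in IGP,
    # 0.13 m per 1 m reduction in GWDini). Additivity of the effects is an assumption.

    def estimate_mpd_shift(iwp_reduction_pct: float,
                           igp_reduction_pct: float,
                           gwd_ini_reduction_m: float) -> float:
        """Approximate increase in minimum pipe depth (m) relative to a reference scenario."""
        d_mpd = 0.0
        d_mpd += 0.08 * (iwp_reduction_pct / 10.0)   # winter irrigation period effect
        d_mpd += 0.03 * (igp_reduction_pct / 10.0)   # growth period irrigation effect
        d_mpd += 0.13 * gwd_ini_reduction_m          # initial groundwater table depth effect
        return d_mpd

    if __name__ == "__main__":
        # Hypothetical case: 20 % less winter irrigation, 10 % less growth-period
        # irrigation, and an initial water table 0.5 m shallower than the reference.
        print(f"Estimated MPD increase: {estimate_mpd_shift(20, 10, 0.5):.2f} m")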