Root zone soil moisture (RZSM) is crucial for agricultural water management and land surface processes. The 1 km soil water index (SWI) dataset from the Copernicus Global Land Service, provided for eight fixed characteristic time lengths (T), requires optimization of T (Topt) for each root zone depth, and its use is limited by its coarse spatial resolution. To estimate RZSM at 100-m resolution, we integrated the depth specificity of the SWI with random forest (RF) downscaling. Topographic, synthetic aperture radar (SAR), and optical datasets were utilized to develop three RF models (RF1: SAR, RF2: optical, RF3: SAR + optical). At the DEMMIN experimental site in northeastern Germany, Topt ranged from 20 to 60 days for depths of 10 to 30 cm and increased to 100 days for 40–60 cm. RF3 outperformed the other models on the 1 km test data. Following residual correction, all high-resolution predictions exhibited strong spatial accuracy (R ≥ 0.94). Both products (1 km and 100 m) agreed well with observed RZSM during summer but overestimated it in winter. The mean R between observed RZSM and the 1 km (100 m; RF1, RF2, and RF3) SWI ranged from 0.74 (0.67, 0.76, and 0.68) to 0.90 (0.88, 0.81, and 0.82), with the lowest and highest R at 10 cm and 30 cm depth, respectively. The average RMSE using the 1 km (100 m; RF1, RF2, and RF3) SWI increased from 2.20 Vol.% (2.28, 2.28, and 2.35) at 30 cm to 3.40 Vol.% (3.50, 3.70, and 3.60) at 60 cm. These negligible differences in accuracy underscore the potential of the proposed method to estimate RZSM for precise local applications, e.g., irrigation management.
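To make the downscaling-with-residual-correction workflow concrete, the sketch below shows one plausible realization in Python with scikit-learn: an RF model is trained on the 1 km SWI (for a given Topt) against coarse-scale predictors, applied to 100-m predictors, and then corrected by adding the resampled coarse-scale residuals. This is a minimal illustration, not the study's processing chain; the grid sizes, synthetic predictor values, and the bilinear resampling are assumptions made only for this example.

```python
# Minimal sketch of RF downscaling with residual correction (illustrative only).
# The synthetic arrays stand in for the SAR/optical/topographic predictors and
# the 1 km SWI described in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from scipy.ndimage import zoom

rng = np.random.default_rng(0)

# Coarse (1 km) SWI grid for one characteristic time length Topt, e.g. 50 x 50 cells.
swi_1km = rng.uniform(10, 40, size=(50, 50))            # Vol.%
# Predictor stacks: 3 features (e.g. backscatter, NDVI, elevation) at both scales.
pred_1km = rng.normal(size=(50, 50, 3))
pred_100m = zoom(pred_1km, (10, 10, 1), order=1)         # 500 x 500, 100-m grid

# 1. Train the RF at the coarse scale.
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(pred_1km.reshape(-1, 3), swi_1km.ravel())

# 2. Predict at the fine scale.
swi_100m = rf.predict(pred_100m.reshape(-1, 3)).reshape(500, 500)

# 3. Residual correction: aggregate the 100-m prediction back to 1 km,
#    compute the coarse residual, resample it to 100 m, and add it back.
swi_100m_agg = swi_100m.reshape(50, 10, 50, 10).mean(axis=(1, 3))
residual_1km = swi_1km - swi_100m_agg
swi_100m_corrected = swi_100m + zoom(residual_1km, (10, 10), order=1)

print("Downscaled SWI range (Vol.%):",
      round(float(swi_100m_corrected.min()), 1), "-",
      round(float(swi_100m_corrected.max()), 1))
```

The residual step guarantees that the corrected 100-m field reproduces the 1 km SWI when aggregated, which is why the high-resolution predictions retain strong spatial agreement with the coarse product.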