Abstract

Surface water area estimation is essential for understanding global environmental dynamics, yet it presents significant challenges, particularly for small water bodies such as ponds and narrow rivers. Surface water areas for these small bodies are often inaccurately represented by existing methods due to the spatial resolution limitations of commonly used remote sensing images. This study introduces DeepWaterFraction (DWF), a deep learning approach for estimating percent surface water area from Landsat mission imagery. DWF is trained with a self-training method that creates training data by upscaling remote sensing images and water map labels to a lower resolution, enabling the creation of a large-scale, globally distributed training dataset. DWF demonstrates superior accuracy in estimating areas for small water bodies compared to several existing surface water area estimation methods, with a pixel-wise root mean squared error of 14.3 %. Specifically, it reduces error rates by 54.3 % for water bodies with a minimum area of 0.001 km² and by 22.6 % for those with a minimum area of 0.01 km². DWF's application to global river discharge inversion is also explored, showcasing its capability to capture width variations in narrow rivers (<90 m) better than existing methods, and its robustness across environments including wetlands, tree cover, and urban areas. Even for wider rivers (>150 m), DWF's performance remains superior, as its ability to accurately quantify mixed water pixel areas effectively reflects discharge variations when the varying area is small. We find that self-training is an effective strategy for generating extensive global training datasets for water mapping, with a high upscaling factor being critical for ensuring label accuracy. This study presents a step forward in the accurate global mapping of water resources.
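The core of the self-training idea described above is that a binary water map at native resolution becomes a *fractional* (percent water) label once it is block-averaged to a coarser grid. The sketch below illustrates this under simple assumptions; the function name, the pure-NumPy block averaging, and the toy mask are illustrative and not taken from the paper's actual pipeline.

```python
import numpy as np

def fractional_labels(water_mask: np.ndarray, factor: int) -> np.ndarray:
    """Block-average a binary water mask (1 = water, 0 = land) by an
    integer upscaling factor, yielding percent-water labels per coarse pixel.

    Illustrative sketch only; assumes mask dimensions divide evenly by factor.
    """
    h, w = water_mask.shape
    assert h % factor == 0 and w % factor == 0, "mask must tile evenly"
    # Reshape into (coarse_rows, factor, coarse_cols, factor) blocks,
    # then average each block and scale to percent.
    blocks = water_mask.reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3)) * 100.0

# Toy example: a 3x3 pond in the corner of a 6x6 high-resolution mask.
mask = np.zeros((6, 6))
mask[:3, :3] = 1.0
labels = fractional_labels(mask, factor=3)
# The top-left coarse pixel is 100 % water; the rest are 0 %.
```

A higher upscaling factor averages over more native pixels per coarse label, which is why the abstract notes that a high factor is critical: it dilutes per-pixel classification errors in the source water map into small fractional errors in the label.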
