Abstract

Cloud cover hinders accurate and timely monitoring of urban land cover (ULC). Combining synthetic aperture radar (SAR) data with cloud-free optical data has shown promising performance in previous research. However, ULC studies in cloud-prone areas are scarce despite the inevitability of cloud cover, especially in the tropics and subtropics. This study proposes a novel weighted cloud dictionary learning (WCDL) method for fusing optical and SAR data for ULC classification in cloud-prone areas. We propose a cloud probability weighting model and a pixel-wise cloud dictionary learning method that account for the differing interference at various cloud probability levels to mitigate cloud contamination. Experiments reveal that the overall accuracy (OA) of the fused data rises by more than 6% and 20% compared with single SAR and optical data, respectively. The method also improves OA by 3% compared with methods that directly stitch optical and SAR data together regardless of cloud interference, and it raises the producer's accuracy (PA) and user's accuracy (UA) of almost all land-cover classes by up to 9%. Ablation studies further show that the cloud probability weighting model improves the OA of all classifiers by up to 5%, while the pixel-wise cloud dictionary learning model improves OA by more than 2% under all cloud conditions and enhances UA and PA by up to 9% and 10%, respectively. The proposed WCDL method can serve as a reference for fusing cloud-contaminated optical and SAR data and for timely, continuous, and accurate land-surface monitoring in cloudy areas.
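To illustrate the general idea of cloud-probability-weighted fusion described above, the following is a minimal, hypothetical sketch only: the dictionary, feature counts, and weighting scheme are illustrative assumptions, not the authors' actual WCDL implementation. It down-weights optical features by per-pixel cloud probability (so cloudy pixels rely more on SAR) and solves a weighted least-squares code against a small joint dictionary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not from the paper)
n_optical, n_sar, n_atoms, n_pixels = 4, 2, 8, 5
D = rng.normal(size=(n_optical + n_sar, n_atoms))   # joint optical+SAR dictionary (toy)
X = rng.normal(size=(n_optical + n_sar, n_pixels))  # stacked optical+SAR features per pixel
cloud_prob = rng.uniform(size=n_pixels)             # per-pixel cloud probability in [0, 1]

codes = np.zeros((n_atoms, n_pixels))
for i in range(n_pixels):
    # Down-weight optical bands as cloud probability grows; SAR bands keep weight 1.
    w = np.concatenate([np.full(n_optical, 1.0 - cloud_prob[i]),
                        np.ones(n_sar)])
    W = np.diag(w)
    # Weighted least-squares code for pixel i: argmin_a ||W (x - D a)||^2
    codes[:, i], *_ = np.linalg.lstsq(W @ D, W @ X[:, i], rcond=None)

print(codes.shape)  # (8, 5)
```

The per-pixel codes could then feed a downstream classifier; the key point is only that cloud-contaminated optical measurements contribute less to the reconstruction than SAR measurements.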
