Abstract
Approximately 1 billion people worldwide currently inhabit slum areas. The UN Sustainable Development Goal (SDG 11.1) underscores the imperative of upgrading all slums by 2030 to ensure adequate housing for everyone. Geo-locations of slums help local governments with upgrading slums and alleviating urban poverty. Remote sensing (RS) technology, with its excellent Earth observation capabilities, can play an important role in slum mapping. Deep learning (DL)-based RS information extraction methods have attracted considerable attention. Currently, DL-based slum mapping studies typically use three optical bands to adapt to existing models, neglecting essential geo-scientific information, such as spectral and textural characteristics, which is beneficial for slum mapping. Inspired by the geoscience-aware DL paradigm, we propose the Geoscience-Aware Network for slum mapping (GASlumNet), aiming to improve slum mapping accuracy by incorporating geoscientific prior knowledge into the DL model. GASlumNet employs a two-stream architecture combining ConvNeXt and UNet. One stream concentrates on optical feature representation, while the other emphasizes geo-scientific features. Further, feature-level and decision-level fusion mechanisms are applied to optimize deep features and enhance model performance. We used Jilin-1 Spectrum 01 and Sentinel-2 images to perform experiments in Mumbai, India. The results demonstrate that GASlumNet achieves higher slum mapping accuracy than the comparison models, with an intersection over union (IoU) of 58.41%. Specifically, GASlumNet improves the IoU by 4.60% to 5.97% over the baseline models, i.e., UNet and ConvNeXt-UNet, which exclusively utilize optical bands. Furthermore, GASlumNet enhances the IoU by 10.97% compared to FuseNet, a model that combines optical bands and geo-scientific features.
Our method presents a new technical solution to achieve accurate slum mapping, offering potential benefits for regional and global slum mapping and upgrading initiatives.
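The abstract reports accuracy as intersection over union (IoU) and describes decision-level fusion of the two streams. As a minimal illustration of these two ideas, the sketch below computes IoU for binary slum masks and averages two streams' probability maps; the function names and the fixed 0.5 weighting are our own illustrative assumptions, not the fusion scheme actually used in GASlumNet.

```python
import numpy as np

def iou(pred, target):
    """Intersection over union for binary segmentation masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return inter / union if union else 0.0

def decision_fusion(prob_optical, prob_geo, w=0.5):
    """Hypothetical decision-level fusion: weighted average of the
    optical stream's and geo-scientific stream's probability maps."""
    return w * prob_optical + (1 - w) * prob_geo

# Toy 4x4 example: predicted mask vs. ground truth
pred = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
gt = np.array([[1, 1, 1, 0],
               [1, 1, 1, 0],
               [0, 0, 0, 0],
               [0, 0, 0, 0]])
print(round(iou(pred, gt), 4))  # 4 overlapping / 6 total -> 0.6667
```

A real evaluation would compute IoU over the full test set rather than a single tile, and the paper's fusion is learned jointly with the network rather than a fixed average.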