Abstract

High-precision road detection from very high resolution (VHR) remote sensing images has broad application value. However, state-of-the-art deep learning-based methods often fail to identify roads when there is a distribution discrepancy between the training samples and the test samples, due to their limited generalization ability. In this paper, to address this problem, an open-source data-driven domain-specific representation (OSM-DOER) framework is proposed for cross-domain road detection. On the one hand, as the spatial structure information of the source and target domains is similar but the texture information differs, the domain-specific representation (DOER) framework is proposed, which not only aligns the distributions of the spatial structure information, but also learns the domain-specific texture information. Furthermore, to enhance the representation of the target domain data distribution, open-source and freely available OpenStreetMap (OSM) road centerline data are utilized to generate target domain samples, which are then used during network training as the supervised information for the target domain. Finally, to verify the superiority of the proposed OSM-DOER framework, we conducted extensive experiments on the public SpaceNet and DeepGlobe road datasets, as well as large-scale road datasets from Birmingham in the UK and Shanghai in China. The experimental results demonstrate that the proposed OSM-DOER framework shows clear advantages over mainstream road detection methods, and that the use of OSM road centerline data has great potential for the road detection task.
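The abstract describes generating target-domain supervision by converting OSM road centerlines into label masks. A minimal sketch of that rasterization step is shown below; the function name, the fixed buffer width, and the pixel-coordinate segment input are illustrative assumptions, not details from the paper (real pipelines would also reproject OSM coordinates into the image grid).

```python
import numpy as np

def rasterize_centerline(segments, shape, width=2.0):
    """Rasterize road centerline segments into a binary label mask.

    segments: list of ((x0, y0), (x1, y1)) line segments in pixel coords
              (illustrative stand-in for reprojected OSM centerlines).
    shape:    (rows, cols) of the output mask.
    width:    buffer radius in pixels around each centerline, approximating
              road half-width.
    """
    mask = np.zeros(shape, dtype=np.uint8)
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    for (x0, y0), (x1, y1) in segments:
        dx, dy = x1 - x0, y1 - y0
        seg_len2 = dx * dx + dy * dy
        if seg_len2 == 0:
            # Degenerate segment: distance to the single point.
            dist = np.hypot(xs - x0, ys - y0)
        else:
            # Project each pixel onto the segment, clamped to its endpoints.
            t = np.clip(((xs - x0) * dx + (ys - y0) * dy) / seg_len2, 0.0, 1.0)
            dist = np.hypot(xs - (x0 + t * dx), ys - (y0 + t * dy))
        # Mark every pixel within the buffer as road.
        mask[dist <= width] = 1
    return mask

# A single diagonal road crossing a 32x32 tile, buffered to a 2 px half-width.
mask = rasterize_centerline([((2, 2), (29, 29))], (32, 32), width=2.0)
```

The resulting mask plays the role of a pseudo-label for the target-domain image tile; its quality is limited by OSM positional accuracy and the assumed road width.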
