Abstract

Recent developments in deep learning have boosted the performance of dense stereo reconstruction. However, state-of-the-art deep learning-based stereo matching methods are mainly trained on close-range synthetic images, so applying them to aerial photogrammetry and remote sensing is currently far from straightforward. In this paper, we propose a new disparity estimation network for stereo matching and investigate how well it generalizes to aerial images. First, we propose an end-to-end deep learning network for stereo matching, regularized by disparity gradients, whose refinement module incorporates a residual cost volume and a reconstruction error volume and which is trained with multiple losses; we present a comprehensive analysis of the influence of these losses. Second, building on this network trained with synthetic close-range data, we propose a new pipeline for matching high-resolution aerial imagery. The experimental results show that the refinement module improves disparity accuracy by up to 40% in terms of errors larger than 1 px compared to the same network without it, especially in areas containing small, detailed objects. In addition, qualitative and quantitative experiments show that our model, pre-trained on a synthetic stereo dataset, achieves very competitive sub-pixel geometric accuracy on aerial images. These results confirm that the domain gap between synthetic close-range and real aerial images can be satisfactorily bridged by the proposed deep learning method for dense image matching.
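To make these ideas concrete, the sketch below is a minimal, hypothetical PyTorch illustration of the two concepts from the abstract that are easiest to show in isolation: a refinement step that warps the right image with an initial disparity, forms a per-pixel reconstruction error, and predicts a disparity residual, plus a simple disparity-gradient regularizer. This is not the authors' implementation; all names (warp_right_to_left, RefinementModule, gradient_loss) and hyperparameters are assumptions, and the residual cost volume of the full method is omitted here.

```python
# Hypothetical sketch, not the authors' code: disparity refinement from a
# photometric reconstruction error, plus a disparity-gradient loss term.
import torch
import torch.nn as nn
import torch.nn.functional as F


def warp_right_to_left(right: torch.Tensor, disp: torch.Tensor) -> torch.Tensor:
    """Warp the right image into the left view using a left-view disparity.

    right: (B, C, H, W) image; disp: (B, 1, H, W) disparity in pixels.
    """
    _, _, h, w = right.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, device=right.device, dtype=right.dtype),
        torch.arange(w, device=right.device, dtype=right.dtype),
        indexing="ij",
    )
    x_shifted = xs.unsqueeze(0) - disp.squeeze(1)            # (B, H, W)
    # Normalize sampling coordinates to [-1, 1] for grid_sample.
    grid_x = 2.0 * x_shifted / (w - 1) - 1.0
    grid_y = (2.0 * ys / (h - 1) - 1.0).unsqueeze(0).expand_as(grid_x)
    grid = torch.stack((grid_x, grid_y), dim=-1)             # (B, H, W, 2)
    return F.grid_sample(right, grid, align_corners=True)


class RefinementModule(nn.Module):
    """Predicts a disparity residual from the left image, the reconstruction
    error, and the initial disparity (a 2D stand-in for the residual cost
    volume / reconstruction error volume described in the abstract)."""

    def __init__(self, channels: int = 32):
        super().__init__()
        # Input: left image (3) + reconstruction error (3) + disparity (1).
        self.net = nn.Sequential(
            nn.Conv2d(7, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, 3, padding=1),            # residual
        )

    def forward(self, left, right, disp_init):
        recon = warp_right_to_left(right, disp_init)
        error = left - recon                 # per-pixel reconstruction error
        x = torch.cat([left, error, disp_init], dim=1)
        return disp_init + self.net(x)       # refined disparity


def gradient_loss(disp_pred: torch.Tensor, disp_gt: torch.Tensor) -> torch.Tensor:
    """Disparity-gradient regularizer: penalize differences between the
    horizontal/vertical gradients of predicted and ground-truth disparity."""
    dx = (disp_pred[..., :, 1:] - disp_pred[..., :, :-1]) - (
        disp_gt[..., :, 1:] - disp_gt[..., :, :-1]
    )
    dy = (disp_pred[..., 1:, :] - disp_pred[..., :-1, :]) - (
        disp_gt[..., 1:, :] - disp_gt[..., :-1, :]
    )
    return dx.abs().mean() + dy.abs().mean()


# Toy usage with random tensors, just to show the expected shapes.
left, right = torch.rand(1, 3, 64, 64), torch.rand(1, 3, 64, 64)
disp0 = torch.rand(1, 1, 64, 64) * 8.0
refined = RefinementModule()(left, right, disp0)             # (1, 1, 64, 64)
loss = gradient_loss(refined, disp0)
```

The residual formulation (disp_init plus a learned correction) reflects the refinement idea in the abstract: the network only needs to learn a small correction on top of an initial estimate, which is typically easier to optimize than regressing disparity from scratch.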
