Abstract

Automated building extraction from single, very-high-resolution (VHR) images remains one of the most challenging tasks for urban planning, population estimation, understanding urban dynamics, and many other applications. The complexity of building objects causes them to be oversegmented into multiple segments in the object-based image analysis (OBIA) method. Selecting an appropriate segmentation scale parameter is a major challenge in OBIA that influences the extraction of discriminative features, especially for building objects. The transferability of OBIA methods is a further challenge. Convolutional neural networks (CNNs) are now a well-established tool for image scene classification; however, scene classification based on CNNs remains difficult due to the scale variation of objects in VHR images. To meet these challenges, we propose a novel object-based deep CNN (OCNN) framework for VHR images. The datasets used for testing were Vaihingen (Germany) aerial images and Tunis WorldView-2 (WV2) satellite imagery. Experimental results show that our framework generalizes to different image types from the same sensor or from another sensor (for example, WV2) after a single fine-tuning. In addition, our framework extracts buildings of different types with respect to size, color, material, spectral similarity to roads, and complex backgrounds. Quantitative evaluation at the object level demonstrated that the proposed framework yields promising results (average precision 0.88, recall 0.92, quality 0.82, F-score 0.90, overall accuracy 0.95, and Kappa coefficient 0.90). Comparative experiments indicate that the proposed OCNN significantly outperforms the traditional method for building extraction.

