Abstract

Image segmentation has a strong influence on the classification accuracy of object-based image analysis, so improving the performance of remote sensing image segmentation is a key issue. This is challenging, however, primarily because it is difficult to avoid both over-segmentation errors (OSE) and under-segmentation errors (USE). To address this problem, this article presents a new segmentation technique that fuses a region merging method with an unsupervised segmentation evaluation technique, under- and over-segmentation aware (UOA), which is improved here by incorporating edge information. Edge information is also used to construct the merging criterion of the proposed approach. To validate the new segmentation scheme, five scenes of high-resolution imagery acquired by the Gaofen-2 and Ziyuan-3 multispectral sensors are selected for the experiments, and quantitative evaluation metrics are employed. The results indicate that the proposed algorithm obtains the lowest total error (TE) values for all test images (0.3791, 0.1434, 0.7601, 0.7569, and 0.3169 for the first through fifth images, respectively; on average 0.1139 lower than those of the other methods) compared with six state-of-the-art region merging-based segmentation approaches: hybrid region merging, hierarchical segmentation, scale-variable region merging, size-constrained region merging with edge penalty, region merging guided by priority, and region merging combined with the original UOA. Moreover, the proposed method performs better on scenes dominated by artificial objects than on those mainly covered by natural geo-objects.
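
The abstract reports over-segmentation error (OSE), under-segmentation error (USE), and a combined total error (TE), but does not give their formulas. The sketch below is a minimal Python illustration, assuming one commonly used area-overlap style of definition (e.g., Clinton et al., 2010) applied to a single reference object; the function name, the overlap weighting, and the way OSE and USE are combined into TE are assumptions made for illustration, not the paper's actual definitions.

# Illustrative sketch only: area-overlap over-/under-segmentation errors
# for one reference object, combined into a total error. The formulas are
# assumed here, not taken from the paper.

def segmentation_errors(ref_area, seg_areas, overlap_areas):
    """Return (OSE, USE, TE) for a single reference object.

    ref_area      -- area of the reference (ground-truth) object
    seg_areas     -- areas of the segments overlapping the reference object
    overlap_areas -- intersection area of each such segment with the reference
    """
    ose = 0.0  # over-segmentation error: reference split across many segments
    use = 0.0  # under-segmentation error: segments spill beyond the reference
    total_overlap = sum(overlap_areas)
    for seg_area, overlap in zip(seg_areas, overlap_areas):
        w = overlap / total_overlap          # weight each segment by its contribution
        ose += w * (1.0 - overlap / ref_area)
        use += w * (1.0 - overlap / seg_area)
    te = ((ose ** 2 + use ** 2) / 2.0) ** 0.5  # one common way to combine OSE and USE
    return ose, use, te

# Example: a reference object of area 100 covered by two overlapping segments.
print(segmentation_errors(100.0, [60.0, 70.0], [55.0, 40.0]))

In practice such per-object errors would be aggregated over all reference objects in a scene before comparing methods, which is consistent with the single TE value per test image reported above.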
