Abstract

In recent years, object-based image analysis (OBIA) has gained considerable attention in image segmentation. OBIA generally builds on superpixel methods, among which clustering-based approaches play an increasingly important role. However, most clustering methods for generating superpixels suffer from inaccurate pixel assignments caused by inappropriate cluster centers. To address this problem, we propose a competitive mechanism-based superpixel generation (CMSuG) method, which both accelerates convergence and improves robustness to noise. Image segmentation results are then obtained by a region adjacency graph (RAG)-based merging algorithm after the RAG is constructed. However, high segmentation accuracy usually comes at a high computational cost. To improve computational efficiency, we propose a parallel CMSuG algorithm, which runs in much less time than the serial CMSuG method. In addition, we present a parallel RAG construction method to reduce the high cost of serial RAG construction. By leveraging these parallel techniques, the running time of the whole image segmentation method declines, with the time complexity reduced from O(N) + O(K²) to O(N/K) + O(K²), where N is the size of the input image and K is the given number of superpixels. In the experiments, segmentation results on both natural images and remote sensing images demonstrate that our CMSuG method outperforms state-of-the-art superpixel generation methods and, in turn, performs well for image segmentation. Compared with the serial segmentation method, our parallel techniques achieve more than a fourfold speedup on both the remote sensing and natural image datasets.
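
The sketch below is only an illustration of the general RAG-based merging idea described in the abstract; it is not the paper's CMSuG or parallel RAG algorithm. It assumes a superpixel label map is already available, uses 4-neighbourhood adjacency, mean colour as the region feature, and greedy merging of the most similar adjacent pair; all function names (`build_rag`, `merge_regions`) are hypothetical.

```python
# Illustrative sketch only: NOT the paper's CMSuG or parallel RAG method.
# It shows how a region adjacency graph (RAG) can be built from a superpixel
# label map and how adjacent regions can be merged greedily by colour similarity.
import numpy as np

def build_rag(labels):
    """Collect pairs of adjacent superpixel labels by scanning 4-neighbourhoods."""
    edges = set()
    a, b = labels[:, :-1].ravel(), labels[:, 1:].ravel()   # horizontal neighbours
    c, d = labels[:-1, :].ravel(), labels[1:, :].ravel()   # vertical neighbours
    for u, v in zip(np.concatenate([a, c]), np.concatenate([b, d])):
        if u != v:
            edges.add((min(u, v), max(u, v)))
    return edges

def merge_regions(image, labels, target_regions):
    """Greedily merge the pair of adjacent regions with the closest mean colour."""
    labels = labels.copy()
    while True:
        ids = np.unique(labels)
        if len(ids) <= target_regions:
            return labels
        # Mean colour per region (a simple feature; a real method may use richer cues)
        means = {i: image[labels == i].mean(axis=0) for i in ids}
        edges = build_rag(labels)
        # Merge the most similar adjacent pair
        u, v = min(edges, key=lambda e: np.linalg.norm(means[e[0]] - means[e[1]]))
        labels[labels == v] = u

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((32, 32, 3))                      # toy image
    # Toy "superpixel" labels: an 8x8 grid of 4x4 blocks (64 initial regions)
    lab = np.arange(64).reshape(8, 8).repeat(4, 0).repeat(4, 1)
    merged = merge_regions(img, lab, target_regions=10)
    print("regions after merging:", len(np.unique(merged)))
```

In this toy setting the merging loop rebuilds the RAG at every step for clarity; the abstract's point about parallel RAG construction is precisely that this step dominates the cost (the O(K²) term) when the number of superpixels K is large.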
