Abstract

The number of superpixels (i.e., the segmentation scale) is crucial for spectral–spatial hyperspectral image (HSI) classification. Existing methods typically set the segmentation scale through manual trial and error, which is time-consuming and unsuitable for varied and complicated practical applications. Fusing information from multiple complementary scales has proven more effective than using a single scale for HSI classification, but the scale levels are still set manually. In this article, we propose a novel adaptive multiscale segmentation (AMS) method that automatically provides a set of suitable scales adapted to different hyperspectral data. Specifically, based on the assumption that the segmentation scale of an HSI is related to the complexity of the image itself, the texture ratio and the number of land-cover classes are used to construct a candidate scale pool. A good scale exhibits small spectral differences between pixels within the same superpixel (intrasuperpixel discrimination index) and large discrepancies between neighboring superpixels (intersuperpixel discrimination index). Accordingly, an intra–inter discrimination index is defined and applied to characterize each scale. The scale with the best intra–inter discrimination index, which usually yields satisfactory performance, is treated as the initially selected scale. The remaining suitable scales are iteratively compared with the selected ones and added to the target scale pool, until a newly added scale can no longer provide significantly complementary information. Extensive experimental results on three HSI data sets demonstrate the effectiveness of the proposed AMS compared with state-of-the-art methods.
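To make the intra–inter principle concrete, the sketch below scores a segmentation scale by combining the two quantities the abstract describes: the spectral variation of pixels around their superpixel mean (intra index) and the spectral discrepancy between spatially adjacent superpixels (inter index). The exact formula used by the paper is not given in the abstract, so the ratio-based score, the Euclidean spectral distance, and the 4-neighborhood adjacency here are all illustrative assumptions.

```python
import numpy as np

def intra_inter_index(image, labels):
    """Score one segmentation scale of an HSI cube.

    image  : (H, W, B) hyperspectral cube
    labels : (H, W) integer superpixel label map

    A low intra-superpixel spectral variation and a high
    inter-superpixel discrepancy give a high score. This is a
    hypothetical instantiation of the intra-inter idea, not the
    paper's actual definition.
    """
    H, W, B = image.shape
    pixels = image.reshape(-1, B)
    flat = labels.ravel()
    ids = np.unique(flat)
    id_to_row = {i: r for r, i in enumerate(ids)}
    # Mean spectrum of each superpixel.
    means = np.array([pixels[flat == i].mean(axis=0) for i in ids])
    # Intra index: mean distance of pixels to their superpixel mean.
    centers = means[[id_to_row[i] for i in flat]]
    intra = np.linalg.norm(pixels - centers, axis=1).mean()
    # Inter index: mean distance between 4-adjacent superpixel means.
    right = np.stack([labels[:, :-1].ravel(), labels[:, 1:].ravel()], axis=1)
    down = np.stack([labels[:-1, :].ravel(), labels[1:, :].ravel()], axis=1)
    pairs = {(min(a, b), max(a, b))
             for a, b in np.vstack([right, down]) if a != b}
    inter = np.mean([np.linalg.norm(means[id_to_row[a]] - means[id_to_row[b]])
                     for a, b in pairs])
    # Higher inter and lower intra imply a better scale.
    return inter / (intra + 1e-12)
```

A segmentation whose superpixels align with homogeneous land-cover regions should score higher than one whose superpixels straddle region boundaries, which is the property the iterative scale-selection step exploits.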
