Abstract

Probabilistic topic models, owing to their generative semantic nature, have recently shown encouraging results in remote sensing image fusion for land cover categorization. However, standard topic models have not yet been adapted to the inherent complexity of remotely sensed data, which may ultimately limit their performance. In this context, this paper presents a new topic-based image fusion framework, specially designed to fuse synthetic aperture radar (SAR) and multispectral imaging (MSI) data for unsupervised land cover categorization tasks. Specifically, we first propose a hierarchical multi-modal probabilistic latent semantic analysis (HMpLSA) model that takes advantage of two different vocabulary modalities, as well as two different levels of topics, in order to effectively uncover intersensor semantic patterns. We then define an SAR and MSI data fusion framework based on HMpLSA to perform unsupervised land cover categorization. Our experiments, conducted on three different SAR and MSI data sets, reveal that the proposed approach provides competitive advantages over standard clustering methods and topic models, as well as several multimodal topic model variants available in the literature.

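To give a concrete sense of the direction described in the abstract, the sketch below implements a simplified multi-modal pLSA with a single, shared topic level: SAR and MSI visual-word counts for each image patch are coupled through a common per-patch topic distribution, which is then used for unsupervised clustering. This is an illustrative assumption on our part, not the authors' HMpLSA (which additionally introduces a second, hierarchical level of topics); all function and variable names here are hypothetical.

```python
import numpy as np

def multimodal_plsa(X_sar, X_msi, n_topics=8, n_iter=50, seed=0):
    """Simplified multi-modal pLSA (single topic level, not HMpLSA).

    X_sar : (n_docs, V_sar) count matrix of SAR visual words per patch
    X_msi : (n_docs, V_msi) count matrix of MSI visual words per patch
    Returns P(z|d) and the per-modality word distributions P(w|z).
    """
    rng = np.random.default_rng(seed)
    n_docs = X_sar.shape[0]

    # Random initialisation of the model parameters.
    p_z_d = rng.random((n_docs, n_topics))
    p_z_d /= p_z_d.sum(1, keepdims=True)
    p_w_z = {}
    for name, X in (("sar", X_sar), ("msi", X_msi)):
        p = rng.random((n_topics, X.shape[1]))
        p_w_z[name] = p / p.sum(1, keepdims=True)

    for _ in range(n_iter):
        new_p_z_d = np.zeros_like(p_z_d)
        for name, X in (("sar", X_sar), ("msi", X_msi)):
            # E-step: responsibilities P(z|d,w) for every (patch, word) pair.
            joint = p_z_d[:, :, None] * p_w_z[name][None, :, :]   # (D, Z, V)
            joint /= joint.sum(1, keepdims=True) + 1e-12
            expected = X[:, None, :] * joint                      # expected counts
            # M-step (modality part): P(w|z) from expected counts.
            p_w_z[name] = expected.sum(0)
            p_w_z[name] /= p_w_z[name].sum(1, keepdims=True) + 1e-12
            new_p_z_d += expected.sum(2)
        # M-step (document part): P(z|d) pools expected counts of both sensors.
        p_z_d = new_p_z_d / (new_p_z_d.sum(1, keepdims=True) + 1e-12)

    return p_z_d, p_w_z

if __name__ == "__main__":
    # Toy example: 100 patches, 64 SAR words, 96 MSI words (synthetic counts).
    rng = np.random.default_rng(1)
    X_sar = rng.poisson(2.0, size=(100, 64))
    X_msi = rng.poisson(2.0, size=(100, 96))
    p_z_d, _ = multimodal_plsa(X_sar, X_msi, n_topics=6)
    labels = p_z_d.argmax(axis=1)   # unsupervised land-cover cluster labels
    print(labels[:10])
```

In this sketch the fusion happens only through the shared P(z|d); the paper's HMpLSA goes further by adding a second topic level to capture intersensor semantic patterns hierarchically.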