Abstract

In many applications, such as image retrieval and change detection, we need to assess the similarity of two statistical models. As a dissimilarity measure between two probability density functions, the Kullback-Leibler divergence is widely used for comparing statistical models. Unfortunately, for some models, such as the Gaussian Mixture Model (GMM), the Kullback-Leibler divergence has no analytically tractable formula, and we must resort to approximation methods. In this paper, we compare seven methods, namely the Monte Carlo method, the matched bound approximation, the product of Gaussians approximation, the variational method, the unscented transformation, the Gaussian approximation, and the min-Gaussian approximation, for approximating the Kullback-Leibler divergence between two Gaussian mixture models for satellite image retrieval. Two image retrieval experiments based on two publicly available datasets have been performed. The comparison is carried out in terms of both retrieval performance and computational time.
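Of the seven approximations named above, the Monte Carlo method is the baseline that the others are typically measured against: draw samples from the first mixture f and average the log-density ratio log f(x) - log g(x). The sketch below illustrates this idea for one-dimensional GMMs; the function names, parameterization, and sample sizes are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def gmm_logpdf(x, weights, means, stds):
    # Log-density of a 1-D Gaussian mixture at points x, computed
    # stably with the log-sum-exp trick over the components.
    x = np.asarray(x)[:, None]
    log_comp = (np.log(weights)
                - 0.5 * np.log(2.0 * np.pi * stds ** 2)
                - (x - means) ** 2 / (2.0 * stds ** 2))
    m = log_comp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(log_comp - m).sum(axis=1, keepdims=True))).ravel()

def gmm_sample(n, weights, means, stds, rng):
    # Ancestral sampling: pick a component by its weight, then draw
    # from that component's Gaussian.
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comp], stds[comp])

def kl_monte_carlo(n, wf, mf, sf, wg, mg, sg, rng):
    # D(f || g) ~= (1/n) * sum_i [log f(x_i) - log g(x_i)],  x_i ~ f.
    x = gmm_sample(n, wf, mf, sf, rng)
    return np.mean(gmm_logpdf(x, wf, mf, sf) - gmm_logpdf(x, wg, mg, sg))
```

The estimate converges to the true divergence as n grows, but at the cost of the per-sample density evaluations; this accuracy/runtime trade-off is exactly what motivates the cheaper closed-form approximations compared in the paper.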

