Abstract

In this work, we consider Fisher information and Bayes-Fisher information measures for the mixing parameter vector of a finite mixture density function and develop some associated results for this model. We provide several interesting connections between these measures and some known informational measures, such as the chi-square divergence, Shannon entropy, and the Kullback-Leibler, Jeffreys, and Jensen-Shannon divergences. Finally, to demonstrate the usefulness of the Bayes-Fisher information measure, we apply it to a real example in image processing and present some numerical results. Our findings show that this information measure is an effective criterion for quantifying the similarity between two images.
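As a minimal sketch of the kind of quantity the abstract refers to (not the paper's own code or definitions): for a two-component mixture f(x; p) = p f1(x) + (1 - p) f2(x), the Fisher information about the mixing weight p is the expectation of the squared score, I(p) = ∫ (f1(x) - f2(x))^2 / f(x; p) dx. The Gaussian components and integration limits below are illustrative assumptions.

```python
# Sketch: Fisher information of the mixing weight p in a two-component
# mixture f(x; p) = p*f1(x) + (1-p)*f2(x), by numerical integration.
# Component densities and integration range are illustrative choices.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def fisher_info_mixing(p, f1, f2, lo=-20.0, hi=20.0):
    """I(p) = E[(d/dp log f(X; p))^2] = integral of (f1 - f2)^2 / f over x."""
    def integrand(x):
        mix = p * f1(x) + (1.0 - p) * f2(x)
        return (f1(x) - f2(x)) ** 2 / mix
    value, _abs_err = quad(integrand, lo, hi)
    return value

# Example usage with two hypothetical Gaussian components
f1 = norm(loc=0.0, scale=1.0).pdf
f2 = norm(loc=2.0, scale=1.5).pdf
print(fisher_info_mixing(0.3, f1, f2))
```

The integrand makes the connection to the chi-square divergence mentioned in the abstract visible: I(p) has the form of a chi-square-type discrepancy between the component densities weighted by the mixture.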
