Abstract
Several information and divergence measures in the literature help quantify the knowledge contained in sources of information. Studying an information source from both positive and negative aspects yields more accurate and comprehensive information. In many cases, extracting information through the positive approach may not be an easy task, while it may be feasible through the negative aspect. Negation offers a new perspective and direction for quantifying the information or knowledge in a given system from the negative approach. In this work, we study some new information measures, such as Fisher information, Fisher information distance, Jensen–Fisher information and Jensen–Shannon entropy measures, based on complementary distributions. We then show that the proposed Jensen–Fisher information measure can be expressed in terms of the Fisher information distance measure. We further show that the Jensen–Shannon entropy measure has two representations, in terms of the Kullback–Leibler divergence and Jensen–extropy measures. Some illustrations related to the complementary distributions of Bernoulli and Poisson random variables are then presented. Finally, for illustrative purposes, we examine a real example based on Conway's game of life and present some numerical results in terms of the proposed information measures.
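The abstract mentions the Jensen–Shannon entropy between a distribution and its complement, illustrated with a Bernoulli random variable. A minimal sketch of that computation is given below, assuming the common convention that the complementary Bernoulli distribution simply swaps the success and failure probabilities; the function names are illustrative, not taken from the paper.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence between two discrete pmfs (in nats)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL of p and q to their mixture."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def bernoulli(p):
    """Pmf of a Bernoulli(p) variable as [P(X=0), P(X=1)]."""
    return [1 - p, p]

# Complementary (negated) Bernoulli: success/failure probabilities swapped.
p = 0.8
P = bernoulli(p)
P_bar = bernoulli(1 - p)

# The mixture of P and P_bar is uniform, so the divergence reduces to
# log(2) minus the Shannon entropy of Bernoulli(p); it vanishes at p = 0.5.
print(js_divergence(P, P_bar))
```

For a Bernoulli distribution, the mixture of the pmf and its complement is always the fair coin, so the Jensen–Shannon divergence grows from 0 (at p = 0.5, where the distribution equals its complement) toward log 2 as p approaches 0 or 1.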