Abstract

During the last six decades, information theory has attracted researchers worldwide and its literature has grown by leaps and bounds. Some of its terminology has even become part of our daily language. Every probability distribution has some uncertainty associated with it, and the concept of ‘entropy’ is introduced here to provide a quantitative measure of this uncertainty. Different approaches to the measurement of entropy and its development are presented, viz.: (1) an axiomatic approach; (2) measures of entropy via measures of inaccuracy and directed divergence; and (3) information measures and the coding theorem. Hypothetical data for the agricultural, fisheries and forestry sectors were framed for each of nine years. All inputs bought by the fisheries and forestry sectors were supplied by other firms of the same sector. It was worked out that the smaller the distance (directed divergence) of the probability distribution P from Q, the greater the uncertainty and the greater the entropy; the divergence is always non-negative and vanishes if and only if P = Q. Now from the Shannon entropy …
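
To make the claim above concrete, a minimal sketch in Python follows (not from the paper; the function names and the use of natural logarithms are illustrative assumptions). It computes the Shannon entropy H(P) = −Σ pᵢ log pᵢ and the directed (Kullback–Leibler) divergence D(P‖Q) = Σ pᵢ log(pᵢ/qᵢ), and checks the identity H(P) = log n − D(P‖U) for the uniform distribution U, which is what links a smaller divergence from Q = U to a greater entropy.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum p_i * log(p_i), in nats (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def directed_divergence(p, q):
    """Kullback-Leibler directed divergence D(P||Q) = sum p_i * log(p_i / q_i).
    Always non-negative, and zero if and only if P = Q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative three-point distribution (hypothetical values, standing in
# for shares of the agricultural, fisheries and forestry sectors).
P = [0.5, 0.3, 0.2]
n = len(P)
U = [1.0 / n] * n  # uniform distribution over the same n outcomes

H = shannon_entropy(P)
D = directed_divergence(P, U)

# Identity: H(P) = log n - D(P || U), so the smaller the divergence of P
# from the uniform distribution, the greater the uncertainty (entropy).
assert abs(H - (math.log(n) - D)) < 1e-12
print(f"H(P) = {H:.4f} nats, D(P||U) = {D:.4f}, log n = {math.log(n):.4f}")
```

In this sketch entropy is maximal (log n) exactly when D(P‖U) = 0, i.e. when P is uniform, consistent with the vanishing condition P = Q stated above.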
