Abstract

We revisit the possibilistic (strictly non-probabilistic) model for information sources and information coding put forward in (Fuzzy Sets and Systems 132(1) (2002) 11–32), where the coding-theoretic possibilistic entropy is defined as the asymptotic rate of compression codes that are optimal with respect to a possibilistic (not probabilistic) criterion. By proving a uniqueness theorem, in this paper we also provide an axiomatic derivation of this possibilistic entropy, and are thus able to support its use as an adequate measure of non-specificity, or rather of “possibilistic ignorance”, as we shall prefer to say. We compare our possibilistic entropy with two well-known measures of non-specificity: the Hartley measure from set theory and the U-uncertainty from possibility theory. The comparison allows us to show that the latter also possesses a coding-theoretic meaning.
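For reference, the two classical non-specificity measures involved in this comparison are standardly defined as follows; these definitions are recalled from the possibility-theory literature (Hartley; Higashi and Klir), not reproduced from the paper itself. For a finite nonempty set $A$, and for a possibility distribution $r = (r_1, \dots, r_n)$ ordered so that $1 = r_1 \ge r_2 \ge \cdots \ge r_n$, with the convention $r_{n+1} := 0$:

$$H(A) = \log_2 |A|, \qquad U(r) = \sum_{i=2}^{n} (r_i - r_{i+1}) \log_2 i .$$

On a crisp distribution ($r_1 = \cdots = r_n = 1$) the U-uncertainty collapses to the Hartley measure $\log_2 n$, which is the sense in which the former generalizes the latter to graded possibility.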
