Abstract

Applying the concept of entropy of deterministic functions to the (cumulative) probability distribution function of random variables yields the (Shannon) entropy of order c. The latter can also be obtained as a consequence of a set of preliminary desiderata, or prior axioms. These results, together with the fact that entropy can be thought of as the maximum value of conditional entropy, allow us to suggest an approach to the entropy of (A or B) that differs from the usual information-theoretic one. As an application, new composition laws for fuzzy sets are derived, which are softer than the min-max rules.
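For context, the min-max rules mentioned above are the standard (Zadeh) composition laws for fuzzy sets; a brief restatement follows, using membership functions \mu_A and \mu_B as standard notation not defined in the abstract itself. The softer composition laws derived in the paper are not reproduced here.

\[
\mu_{A \cap B}(x) = \min\bigl(\mu_A(x), \mu_B(x)\bigr), \qquad
\mu_{A \cup B}(x) = \max\bigl(\mu_A(x), \mu_B(x)\bigr).
\]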
