Abstract

Negation generally represents the contradiction or denial of something. In classical logic, negation is the connective that maps truth to falsehood. However, there are situations in which we must deal with truth values other than true or false, and probability theory has been effective in modelling them. Exploring probability functions from the point of view of negation should therefore open new horizons in the study of random phenomena. The negation of a probability distribution results from a transformation that reallocates the probability of each event equally among the other alternatives in a finite sample space. Proposed by Yager, this transformation possesses the maximum-entropy allocation. Since any kind of mixing increases the disorder (uncertainty) of a system, the uncertainty associated with the negation should intuitively be greater than that of the original distribution. A closer look at Yager's negation reveals that the negated probabilities are convex combinations of the probabilities in the original distribution; we exploit this to estimate the unpredictability associated with the negation, and the implications are discussed in detail. We also show that the concept of negation can help estimate the symmetry and bias of probability distributions, taking the binomial distribution as an example. An extension to Atanassov's intuitionistic fuzzy sets is also discussed.
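As an illustration of the transformation described above (a sketch added here, not taken from the paper), Yager's negation maps each probability p_i in a distribution over n outcomes to (1 - p_i)/(n - 1), i.e. the mass of every event is shared equally among the other n - 1 alternatives. A minimal Python sketch, together with a Shannon-entropy check of the claim that negation increases uncertainty:

```python
import math

def yager_negation(p):
    """Yager's negation of a finite probability distribution:
    each event's probability is reallocated equally among the
    remaining n - 1 alternatives, p_bar_i = (1 - p_i) / (n - 1)."""
    n = len(p)
    return [(1 - pi) / (n - 1) for pi in p]

def shannon_entropy(p):
    """Shannon entropy in bits; terms with p_i = 0 contribute nothing."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Example distribution (illustrative values, not from the paper)
p = [0.7, 0.2, 0.1]
neg = yager_negation(p)       # [0.15, 0.40, 0.45]

# The negation is again a probability distribution, and
# (for non-degenerate p) its entropy exceeds that of p.
print(sum(neg))                          # 1.0
print(shannon_entropy(p) < shannon_entropy(neg))  # True
```

Note that each negated probability (1 - p_i)/(n - 1) is the average of the other n - 1 original probabilities, which is exactly the convex-combination structure the abstract refers to.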
