Abstract

Background: Measuring information and removing uncertainty are essential to human thinking and to many real-world objectives. Information is most useful when it is free from uncertainty and fuzziness. Shannon was the first to use the term entropy as a measure of uncertainty, and he gave an expression for entropy based on a probability distribution. Zadeh used Shannon's idea to develop the concept of fuzzy sets. Later, Atanassov generalized the concept of a fuzzy set and developed intuitionistic fuzzy sets. Purpose: Sometimes we do not have complete information about a fuzzy set or an intuitionistic fuzzy set; only partial information is known, i.e., either only a few values of the membership function μ_A or of the non-membership function ν_A are known, or a relationship between them is known, or some inequalities governing these parameters are known. Kapur measured the partial information given by a fuzzy set. In this paper, we attempt to quantify the partial information given by intuitionistic fuzzy sets by considering all of these cases. Methodologies: We analyze some well-known definitions and axioms used in the field of fuzzy theory. Principal Results: We devise methods to measure the incomplete information given about intuitionistic fuzzy sets. Major Conclusions: By devising methods of measuring partial information about an IFS, we can use this information to form an idea of the given set and to make sound decisions.
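For context, the lines below restate the standard Atanassov definitions assumed in the abstract (a background sketch, not text from the paper): an IFS A on a universe X assigns each element x a membership degree μ_A(x) and a non-membership degree ν_A(x), and whatever is left over is the hesitation margin.

  A = {⟨x, μ_A(x), ν_A(x)⟩ : x ∈ X},   with 0 ≤ μ_A(x) + ν_A(x) ≤ 1
  π_A(x) = 1 − μ_A(x) − ν_A(x)   (hesitation margin)

Partial information about A then means, for example, that μ_A(x) is known only for some x, or that only a relation or inequality linking μ_A and ν_A is given, rather than both functions being known on all of X.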

Highlights

  • Information theory was developed by Shannon [1] in 1948

  • We study some well-known definitions and concepts related to fuzzy sets and intuitionistic fuzzy sets given by various researchers

  • If we know something about an intuitionistic fuzzy set (IFS), such as the values of some of the μ_A's and ν_A's or some relationships governing them, that information is known as partial information about the IFS


Summary

Introduction

Shannon [1] was the first to use the term entropy for measuring information. For a probability distribution P = (p_1, p_2, …, p_n), the information contained in the corresponding experiment is given by H(P) = −∑_{i=1}^{n} p_i ln(p_i) [1], which is well known as Shannon's entropy. Kapur measured the partial information given by a fuzzy set. In this paper, we attempt to quantify the partial information given by intuitionistic fuzzy sets by considering all the cases, and we devise methods to measure such incomplete information. By measuring partial information about an IFS, we can form an idea of the given set and use this information to make sound decisions.
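As a quick illustration of the entropy expression above (our own sketch, not code from the paper; the function name shannon_entropy is ours), H(P) can be computed directly from a discrete distribution:

  import math

  def shannon_entropy(probs):
      # H(P) = -sum over i of p_i * ln(p_i); terms with p_i = 0 contribute 0 by the usual convention.
      return -sum(p * math.log(p) for p in probs if p > 0)

  # Example: a fair coin gives H = ln 2 ≈ 0.693 nats; a certain outcome gives H = 0.
  print(shannon_entropy([0.5, 0.5]))
  print(shannon_entropy([1.0, 0.0]))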


