Abstract

Information theory, introduced by Shannon in the context of information transfer over communication channels, serves as a foundation for research in many diverse fields. In information theory, entropy is the average amount of information, or the rate at which information is produced, when a message is formed element by element. Entropy has found broad application across research fields and can also be applied in software engineering to quantify the uncertainty associated with software code. This paper explores information entropy and its application to measuring software complexity, and formulates an entropy-based complexity measure that accounts for logical decision-making, processes, and statement interaction patterns in control flow graphs mapped from actual software code. To broaden the applicability of the proposed metric, the execution times of nodes in the control flow graphs are also incorporated. Finally, the metric is evaluated against eight axioms that a software complexity measure should satisfy.
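To illustrate the entropy notion the abstract builds on, the sketch below computes Shannon entropy over a hypothetical stream of node visits from a control flow graph; the trace and node names are illustrative assumptions, not the paper's actual metric.

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Average information (bits per symbol) of an observed symbol stream."""
    counts = Counter(symbols)
    total = sum(counts.values())
    # H = -sum(p * log2(p)) over the empirical symbol probabilities
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical execution trace of CFG node visits (illustrative only)
trace = ["entry", "if", "then", "join", "if", "else", "join", "exit"]
H = shannon_entropy(trace)  # 2.5 bits: traces dominated by few nodes score lower
```

A uniform trace over many distinct nodes maximizes this value, which is why entropy can serve as a proxy for the unpredictability, and hence complexity, of control flow.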
