Abstract

Understanding the structure and the dynamics of networks is of paramount importance for many scientific fields that rely on network science. Complex network theory provides a variety of features that help in the evaluation of network behavior. However, such analyses can be confusing and even misleading, as each network metric carries many intrinsic properties. Alternatively, Information Theory methods have gained the spotlight because of their ability to create a quantitative and robust characterization of such networks. In this work, we use two Information Theory quantifiers, namely Network Entropy and Network Fisher Information Measure, to analyze those networks. Our approach detects non-trivial characteristics of complex networks, such as the transition present in the Watts-Strogatz model from k-ring to random graphs; the phase transition from a disconnected to an almost surely connected network when we increase the linking probability of the Erdős-Rényi model; and distinct phases of scale-free networks when considering non-linear preferential attachment, fitness, and aging features alongside the configuration model with a pure power-law degree distribution. Finally, we analyze the numerical results for real networks, contrasting our findings with traditional complex network methods. In conclusion, we present an efficient method that ignites the debate on network characterization.

Highlights

  • Understanding the structure and the dynamics of networks is of paramount importance for many scientific fields that rely on network science

  • Understanding how networks arrange their connections, and how the information flows through their nodes, is a breakthrough for many scientific fields that rely on network science to assess all kinds of phenomena

  • We analyze the behavior of Information Theory quantifiers when applied to Random (RN), Small World (SWN), and Scale-Free networks (SFN)

Introduction

Understanding the structure and the dynamics of networks is of paramount importance for many scientific fields that rely on network science. Information Theory methods have gained the spotlight because of their ability to create a more quantitative and robust characterization of complex networks, as an alternative to traditional methods. Standard quantifiers such as Shannon Entropy and Statistical Complexity were adapted to network analysis, providing a different perspective when evaluating networks [1]. To evaluate the Statistical Complexity, we use the entropy as a measure of the information content, while the disequilibrium is expressed by the divergence between the current system state and an appropriate reference state. The calculation of these quantifiers requires a proper probability distribution that represents the system under study. The Fisher Information Measure is a local measure, as it is based upon the gradient of the underlying distribution, being significantly sensitive to even tiny localized perturbations.
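The quantifiers described above can be illustrated with a minimal sketch. The Python snippet below (not the paper's code) computes a normalized Shannon entropy and a discrete Fisher Information Measure from a graph's empirical degree distribution. The FIM discretization used here, a sum of squared differences of square-root probabilities over adjacent bins, is one common choice in the literature; the paper's exact probability distribution and normalization constants may differ.

```python
import math

def degree_distribution(adj):
    """Empirical degree distribution p(k) from an adjacency dict {node: neighbors}."""
    degrees = [len(nbrs) for nbrs in adj.values()]
    p = [0.0] * (max(degrees) + 1)
    for k in degrees:
        p[k] += 1.0 / len(degrees)
    return p

def shannon_entropy(p):
    """Shannon entropy of p, normalized by log(number of bins) to lie in [0, 1]."""
    h = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return h / math.log(len(p)) if len(p) > 1 else 0.0

def fisher_information(p):
    """Discrete Fisher Information Measure (one common discretization):
    sum over adjacent bins of (sqrt(p[k+1]) - sqrt(p[k]))**2."""
    return sum((math.sqrt(p[k + 1]) - math.sqrt(p[k])) ** 2
               for k in range(len(p) - 1))

# Toy 4-node ring graph (hypothetical example): every node has degree 2,
# so the degree distribution is maximally ordered (zero entropy, sharp FIM).
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
p = degree_distribution(ring)
print(shannon_entropy(p), fisher_information(p))
```

For the ring graph the distribution concentrates on a single degree, so the normalized entropy is 0 and the FIM is maximal, matching the intuition that FIM is a local, gradient-based quantifier sensitive to sharp features of the distribution.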
