Abstract

Heavy-tailed networks, whose degree distributions have tails that decay more slowly than exponentially, are common in many settings. Interesting cases, in which the heavy tails follow inverse power laws with exponents λ in the range 1<λ<2, arise for associative knowledge networks and for semantic and linguistic networks. In such cases the differences between networks are often subtle, calling for robust methods to characterise them. Here, we introduce a method for comparing networks using a density matrix based on q-generalised adjacency matrix kernels. We show that networks can then be compared using the q-generalised Kullback–Leibler divergence. In addition, the q-generalised divergence can be interpreted as a q-generalised free energy, which enables a thermodynamic-like macroscopic description of heavy-tailed networks. The viability of the q-generalised adjacency kernels and of the thermodynamic-like description for characterising complex networks is demonstrated on a simulated set of modular, heavy-tailed networks with inverse power-law degree distributions in the range 1<λ<2.
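The pipeline outlined above (q-generalised kernel → density matrix → q-generalised divergence) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: it applies a Tsallis q-exponential to the graph Laplacian spectrum (one common choice of matrix kernel; the paper's exact adjacency-based kernel may differ) and uses the Tsallis form of the quantum relative entropy for the comparison.

```python
import numpy as np

def q_exp(x, q):
    # Tsallis q-exponential: [1 + (1-q)x]^(1/(1-q)); reduces to exp(x) as q -> 1.
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    # Clipping enforces the q-exponential cutoff (relevant when q < 1).
    return np.clip(base, 0.0, None) ** (1.0 / (1.0 - q))

def density_matrix(A, q, beta=1.0):
    # Build a trace-one density matrix from a symmetric adjacency matrix A by
    # applying the q-exponential kernel to the spectrum of the graph Laplacian.
    L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian
    w, V = np.linalg.eigh(L)                # real spectrum, orthonormal eigenvectors
    kw = q_exp(-beta * w, q)                # kernel acts on eigenvalues
    rho = (V * kw) @ V.T                    # V diag(kw) V^T
    return rho / np.trace(rho)

def _mat_pow(M, p):
    # Fractional power of a symmetric positive (semi)definite matrix.
    w, V = np.linalg.eigh(M)
    w = np.clip(w, 1e-15, None)             # guard against tiny negative round-off
    return (V * w**p) @ V.T

def q_kl(rho, sigma, q):
    # q-generalised (Tsallis) quantum relative entropy, q != 1:
    #   S_q(rho || sigma) = (1 - Tr[rho^q sigma^(1-q)]) / (1 - q)
    t = np.trace(_mat_pow(rho, q) @ _mat_pow(sigma, 1.0 - q))
    return (1.0 - t) / (1.0 - q)
```

As a usage example, comparing the density matrices of a 3-node path and a triangle with q = 1.5 gives a strictly positive divergence, while the divergence of a density matrix with itself vanishes, since Tr[ρ^q ρ^(1−q)] = Tr ρ = 1.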

