Abstract

Sub-graph entropy has recently been applied to functional brain network analysis to identify important brain regions associated with different brain states and to discriminate the brain networks of subjects with psychiatric disorders from those of healthy controls. This letter describes two pertinent properties of sub-graph entropy. It is shown that when a graph is divided into multiple smaller graphs, the sum of their sub-graph entropies is always bounded above by a constant. Additionally, this sum is always greater than the entropy of the original graph. We also demonstrate that node entropy, a special case of sub-graph entropy, is stable. Experiments using both synthetic data and real-world brain network data are carried out to further validate these points. Overall, node entropy is more stable than other centrality metrics. Furthermore, our results illustrate that, for human functional brain networks in two different induced states, node entropy exhibits a relatively larger change in ranking. Altogether, these findings pave the way for real-world applications of sub-graph entropy as a centrality metric for graph signals.
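To make the quantities discussed above concrete, the sketch below illustrates one common formulation of graph, sub-graph, and node entropy based on normalized edge weights of a weighted adjacency matrix. The function names and the exact definition (entropy over edges with at least one endpoint in the sub-graph) are illustrative assumptions for this sketch and may differ from the paper's formulation.

```python
import numpy as np

def edge_probabilities(W):
    """Normalize the upper-triangular edge weights of a symmetric,
    non-negative adjacency matrix W into a probability distribution."""
    U = np.triu(W, k=1)                  # count each undirected edge once
    total = U.sum()
    return U / total if total > 0 else U

def subgraph_entropy(W, nodes):
    """Entropy contributed by edges with at least one endpoint in `nodes`
    (hypothetical formulation; the paper's exact definition may differ)."""
    P = edge_probabilities(W)
    mask = np.zeros_like(P, dtype=bool)
    mask[list(nodes), :] = True
    mask[:, list(nodes)] = True
    p = P[mask & (P > 0)]
    return float(-(p * np.log(p)).sum())

def node_entropy(W, i):
    """Node entropy as the special case of a single-node sub-graph."""
    return subgraph_entropy(W, [i])

# Example: on a random weighted network, the sum of the node entropies
# exceeds the whole-graph entropy, because each edge is counted in the
# entropy of both of its endpoints.
rng = np.random.default_rng(0)
A = rng.random((10, 10))
W = (A + A.T) / 2
np.fill_diagonal(W, 0)
H_G = subgraph_entropy(W, range(10))
H_nodes = sum(node_entropy(W, i) for i in range(10))
print(H_G, H_nodes)   # H_nodes >= H_G
```

In this formulation the sum of node entropies is exactly twice the graph entropy, which is consistent with the stated bounds: the sum is greater than the graph entropy yet bounded above by a constant multiple of it.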
