Abstract

Riesen and Bunke recently proposed a novel dissimilarity-based approach for embedding graphs into a vector space. One drawback of their approach is the computational cost of the graph edit operations required to compute the dissimilarity between graphs. In this paper we explore whether the Jensen-Shannon divergence can be used as a fast similarity measure between a pair of graphs. We commence by computing the Shannon entropy of a graph associated with a steady-state random walk. We then establish a family of prototype graphs by using an information-theoretic approach to construct generative graph prototypes. With the required graph entropies and the family of prototype graphs to hand, the Jensen-Shannon divergence between a sample graph and a prototype graph can be computed. It is defined as the difference between the entropy of a composite structure formed by the pair of graphs and the mean entropy of the two separate graphs. Since the required graph entropies can be computed efficiently, the proposed graph embedding using the Jensen-Shannon divergence avoids the burdensome graph edit operations. We evaluate our approach on several graph datasets abstracted from computer vision and bioinformatics databases.

Keywords: Shannon Entropy, Edit Distance, Edit Operation, Sample Graph, Vectorial Description
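A minimal sketch of the two quantities the abstract relies on, assuming the composite structure is taken to be the disjoint union of the two graphs (the paper leaves the choice of composite more general). The entropy computation uses the standard fact that the stationary distribution of a random walk on a connected undirected graph assigns each vertex probability deg(v)/2|E|:

```python
import math
from collections import Counter

def walk_entropy(edges):
    """Shannon entropy of the steady-state random walk on an
    undirected graph given as a list of (u, v) edges: the
    stationary probability of vertex v is deg(v) / (2|E|),
    so H(G) = -sum_v p_v * log(p_v)."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    total = sum(deg.values())  # equals 2|E|
    return -sum((d / total) * math.log(d / total) for d in deg.values())

def jensen_shannon_divergence(edges1, edges2):
    """JSD between two graphs: entropy of a composite structure
    minus the mean entropy of the two separate graphs.  Here the
    composite is the disjoint union (an illustrative assumption);
    vertices are tagged 0/1 to keep the two vertex sets apart."""
    union = [((0, u), (0, v)) for u, v in edges1] + \
            [((1, u), (1, v)) for u, v in edges2]
    return walk_entropy(union) - 0.5 * (walk_entropy(edges1)
                                        + walk_entropy(edges2))
```

For example, comparing a graph with itself under this composite gives exactly log 2, the maximum of the Jensen-Shannon divergence, since the disjoint union of two copies of G has entropy H(G) + log 2.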


