Abstract

The evolution of communities in dynamic (time-varying) network data is a prominent topic of interest. A popular approach to understanding these dynamic networks is to embed the dyadic relations into a latent metric space. While methods for clustering with this approach exist for dynamic networks, they all assume a static community structure. This paper presents a Bayesian nonparametric model for dynamic networks that can model networks with evolving community structures. Our model extends existing latent space approaches by explicitly modeling the additions, deletions, splits, and mergers of groups with a hierarchical Dirichlet process hidden Markov model. Our proposed approach, the hierarchical Dirichlet process latent position cluster model (HDP-LPCM), incorporates transitivity, models both individual and group level aspects of the data, and avoids the computationally expensive selection of the number of groups required by most popular methods. We provide a Markov chain Monte Carlo estimation algorithm and demonstrate its ability to detect evolving community structure in a network of military alliances during the Cold War and a narrative network constructed from the Game of Thrones television series.

Highlights

  • Many naturally occurring networks contain discrete changes in community structure

  • We present our extension to the latent position cluster model that can infer an evolving community structure in dynamic networks

  • We describe a Markov chain Monte Carlo (MCMC) method to sample from the hierarchical Dirichlet process (HDP)-latent position cluster model (LPCM)’s posterior

Introduction

Many naturally occurring networks contain discrete changes in community structure. When high school students move across the country for college, old friendship groups often dissolve, leading to the formation of new collegiate friendship groups. When exposed to external stimuli, regions of the brain activate before becoming dormant. By identifying these community-level phase changes, we can gain valuable insight into the rich processes that generate dynamic (longitudinal or time-varying) networks.

The Dirichlet process (DP) is a distribution over discrete probability measures F0. Since draws from a DP are discrete with probability one, the DP cannot be used as a general nonparametric prior over continuous densities. To extend the DP to continuous density estimation, one uses F0 as a mixing measure over some parametric class of distributions fθ. This construction is known as the DP mixture model. Throughout, we say a K-dimensional vector v belongs to the K-simplex if its entries are nonnegative and sum to one.
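The DP-as-mixing-measure construction above can be sketched numerically with the standard truncated stick-breaking representation. The following is a minimal illustration, not the paper's estimation algorithm: the concentration parameter, base measure N(0, 4), and Gaussian kernel f_θ = N(θ, 0.5²) are all illustrative choices, and the truncation level is an approximation to the infinite stick.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(alpha, truncation):
    """Truncated stick-breaking weights for a DP with concentration alpha."""
    betas = rng.beta(1.0, alpha, size=truncation)
    # Each weight is its beta draw times the stick length remaining before it.
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    return betas * remaining

# Draw F0 ~ DP(alpha, H) with base measure H = N(0, 4):
# atom locations come from H, atom weights from the broken stick.
alpha, truncation = 2.0, 500
weights = stick_breaking(alpha, truncation)
atoms = rng.normal(0.0, 2.0, size=truncation)

# DP mixture model: each observation first picks an atom theta_i ~ F0
# (discrete), then draws x_i ~ f_theta = N(theta_i, 0.5^2) (continuous).
n = 1000
ks = rng.choice(truncation, size=n, p=weights / weights.sum())
x = rng.normal(atoms[ks], 0.5)

# Draws from F0 are discrete: the n observations share only a handful of
# distinct atoms, which is what induces clustering in the mixture.
print(len(np.unique(ks)), "distinct atoms among", n, "observations")
```

Mixing the discrete F0 over the Gaussian kernel is what turns the DP into a prior suitable for continuous densities, and the small number of occupied atoms is exactly the clustering behavior the latent position cluster model exploits.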
