Abstract

As the availability and scope of social networks and relational datasets increase, learning latent structure in complex networks has become an important problem for pattern recognition. To construct compact and flexible representations for weighted networks, a Weighted Infinite Relational Model (WIRM) is proposed that learns from both the presence and the weight of links in networks. As a Bayesian nonparametric model based on the Dirichlet process prior, a distinctive feature of WIRM is its ability to learn the latent structure underlying weighted networks without specifying the number of clusters in advance. This is particularly important for structure discovery in complex networks, especially in novel domains where we may have little prior knowledge. We develop a mean-field variational algorithm to efficiently approximate the model's posterior distribution over the infinite latent clusters. Experiments on synthetic and real-world datasets demonstrate that WIRM can effectively capture the latent structure underlying complex weighted networks.
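The abstract's key device, placing a Dirichlet process prior over cluster assignments so the number of clusters need not be fixed, is commonly handled in mean-field variational inference via a truncated stick-breaking construction. The sketch below is illustrative only (it is not the paper's implementation): it draws truncated stick-breaking weights, which is the standard way a variational algorithm represents the DP's infinitely many clusters with a finite truncation level `T` (the names `stick_breaking_weights`, `alpha`, and `T` are my own choices for this example).

```python
import numpy as np

def stick_breaking_weights(alpha, T, rng):
    """Truncated stick-breaking construction of Dirichlet process weights.

    Draw v_t ~ Beta(1, alpha) for t = 1..T and set
    pi_t = v_t * prod_{s<t} (1 - v_s).
    Truncating at level T is the usual device that lets a mean-field
    variational algorithm work with the DP's infinite cluster space.
    """
    v = rng.beta(1.0, alpha, size=T)
    v[-1] = 1.0  # close the stick so the T weights sum to exactly 1
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining

rng = np.random.default_rng(0)
pi = stick_breaking_weights(alpha=2.0, T=20, rng=rng)
assert np.isclose(pi.sum(), 1.0)
```

With a small concentration `alpha`, most of the probability mass falls on the first few sticks, which is why only a handful of clusters end up effectively used even though the truncation level can be set generously.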
