Abstract
Link prediction in dynamic (temporal) networks refers to predicting future edges from the available network information. Among existing temporal link prediction approaches, non-negative matrix factorization (NMF) is a competitive family of algorithms and has attracted extensive attention. However, traditional NMF-based prediction methods are shallow and cannot fully mine the dynamic network, which may degrade performance. To overcome these shortcomings, inspired by the deep Autoencoder, we propose two novel deep Autoencoder-like NMF prediction methods with graph regularization for dynamic networks. By fusing an encoder component with a deep structure into the deep NMF model, our algorithms can fully exploit the complex hierarchical information hidden in dynamic networks. To further extract this information, graph regularization and PageRank are used to capture the local and global topological information of each snapshot, respectively. By jointly optimizing these terms in the deep Autoencoder-like NMF model, our model preserves both the local and global information of dynamic networks simultaneously. Moreover, an effective alternating iterative method with a convergence guarantee is developed to minimize the resulting model. Finally, we evaluate the proposed methods on several synthetic and real-world datasets, demonstrating that they outperform state-of-the-art prediction approaches.
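To make the building block concrete, the following is a minimal sketch of graph-regularized NMF on a single network snapshot, using standard multiplicative updates in the style of Cai et al.'s GNMF. It is an assumed, shallow single-layer illustration of the regularization idea mentioned above, not the paper's deep Autoencoder-like model; the function name `gnmf`, the regularization weight `lam`, and all defaults are hypothetical choices for this example.

```python
import numpy as np

def gnmf(X, A, k, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """Graph-regularized NMF sketch (not the paper's deep model).

    Approximately minimizes ||X - W H||_F^2 + lam * tr(H L H^T),
    where L = D - A is the graph Laplacian of the snapshot's
    adjacency matrix A; the trace term encourages nodes that are
    neighbors in A to have similar latent representations (columns of H).
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k))   # basis matrix, m x k
    H = rng.random((k, n))   # latent node representations, k x n
    D = np.diag(A.sum(axis=1))  # degree matrix of the snapshot graph
    for _ in range(n_iter):
        # Multiplicative updates keep W and H non-negative throughout.
        W *= (X @ H.T) / (W @ H @ H.T + eps)
        H *= (W.T @ X + lam * H @ A) / (W.T @ W @ H + lam * H @ D + eps)
    return W, H
```

In a link-prediction setting, the reconstruction `W @ H` can serve as a score matrix for candidate future edges; the deep Autoencoder-like variants in the paper stack several such factorization layers and add the encoder and PageRank-based global terms on top.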