Abstract

The ability to model the information diffusion process and predict its size is crucial to understanding the information propagation mechanism, and it is useful for many applications such as popularity prediction and fake news detection. Recent work has addressed information cascade prediction using two basic paradigms: (1) sequential methods, e.g., recurrent neural networks (RNNs), and (2) graph learning techniques that retain topological information and account for structural relationships among diffusion participants. However, existing models treat topological and temporal features separately and fall short of capturing their entanglement in the diffusion process. Because the evolving directed acyclic graph (DAG) of information diffusion intrinsically couples topological and temporal dependencies, modeling them in isolation loses cross-domain information. In this paper, we propose a transformer-enhanced Hawkes process (Hawkesformer), which links a hierarchical attention mechanism to the Hawkes self-exciting point process for information cascade prediction. Specifically, we extend the traditional Hawkes process with a topological horizon and efficiently acquire knowledge from the continuous-time domain. A two-level attention architecture parameterizes the intensity function of Hawkesformer. At the first level, we disentangle primary and non-primary diffusion paths to model the coupled topological and temporal information and capture global dependencies between nodes in the graph. At the second level, a local pooling attentive module embeds the cascade evolution rate to model short-term outbreaks. Extensive experiments on two real-world datasets demonstrate significant performance improvements of Hawkesformer over state-of-the-art models.
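To make the core idea concrete, the sketch below contrasts a classical Hawkes intensity, where every past event excites the future through a fixed kernel, with an attention-parameterized variant in the spirit of the abstract, where each event's contribution is weighted by attention over event embeddings (which could encode topology). This is a minimal illustration under assumed exponential time decay; the function names, the scaled dot-product scoring, and the parameters `mu` and `beta` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def classical_intensity(t, event_times, mu=0.1, alpha=0.5, beta=1.0):
    """Classical Hawkes intensity: lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i))."""
    past = event_times[event_times < t]
    return mu + np.sum(alpha * np.exp(-beta * (t - past)))

def attention_intensity(t, event_times, event_emb, query, mu=0.1, beta=1.0):
    """Hypothetical attention-parameterized intensity: the fixed excitation
    alpha is replaced by softmax attention weights over past-event embeddings,
    so each event's influence depends on learned (e.g., topology-aware)
    features rather than a constant."""
    mask = event_times < t
    past_t, past_e = event_times[mask], event_emb[mask]
    if past_t.size == 0:
        return mu  # no history yet: only the base rate fires
    scores = past_e @ query / np.sqrt(query.size)   # scaled dot-product scores
    weights = np.exp(scores - scores.max())         # numerically stable softmax
    weights /= weights.sum()
    return mu + np.sum(weights * np.exp(-beta * (t - past_t)))

# Tiny usage example on synthetic events.
rng = np.random.default_rng(0)
times = np.sort(rng.uniform(0.0, 5.0, size=8))     # event timestamps
emb = rng.normal(size=(8, 16))                     # per-event embeddings
q = rng.normal(size=16)                            # query vector
print(classical_intensity(6.0, times))
print(attention_intensity(6.0, times, emb, q))
```

The design point is that attention lets the excitation vary per event, which is how a transformer can inject structural (topological) information into an otherwise purely temporal point process.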
