Abstract
Heterogeneous event sequence data has become an inseparable and important part of daily life. Such data is characterized by complex long-term and short-term temporal dependencies, which most point process models based on recurrent neural networks fail to capture, limiting their predictive accuracy. The Transformer Hawkes Process (THP) model uses the self-attention mechanism to capture long-term dependencies and is therefore well suited to predicting event sequence data. Graph contrastive learning (GCL) with adaptive augmentation can enhance the data by drawing the intra-class hidden features of instances together while keeping the inter-class hidden features apart. Inspired by these observations, we propose combining THP with adaptively augmented GCL. The proposed Hawkes Process via Graph Contrastive Discriminant representation Learning and Transformer capturing long-term dependencies (GCDRLT) is a two-stage pipeline that strengthens the hidden representation both in capturing long-term dependencies and in extracting discriminative features. Experimental results on multiple datasets validate that the graph contrastive learning method improves the accuracy of the Transformer Hawkes Process model for predicting heterogeneous event sequences.
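To make the contrastive component of the abstract concrete, the sketch below shows one common way such an objective is formulated: an NT-Xent style loss applied to sequence-level hidden representations produced by a Transformer encoder, pulling the representations of two views of the same sequence together while pushing the other sequences in the batch apart. This is an illustrative assumption, not the authors' GCDRLT implementation; all module names, the dropout-style augmentation, and hyperparameters are hypothetical.

```python
# Minimal sketch, assuming a simplified Transformer encoder over event types
# and a SimCLR-style NT-Xent contrastive loss. Not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EventEncoder(nn.Module):
    """Toy Transformer encoder over event-type embeddings (stand-in for THP)."""

    def __init__(self, num_event_types=10, d_model=32, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(num_event_types, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, event_types):                  # (batch, seq_len) int tensor
        h = self.encoder(self.embed(event_types))    # (batch, seq_len, d_model)
        return h.mean(dim=1)                         # sequence-level representation


def nt_xent_loss(z1, z2, temperature=0.5):
    """Contrastive loss: two views of the same sequence are a positive pair."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                   # (2B, d)
    sim = z @ z.t() / temperature                    # pairwise cosine similarities
    sim.fill_diagonal_(float("-inf"))                # exclude self-similarity
    n = z.size(0)
    targets = torch.arange(n, device=z.device).roll(n // 2)  # index of each positive
    return F.cross_entropy(sim, targets)


if __name__ == "__main__":
    enc = EventEncoder()
    seq = torch.randint(0, 10, (8, 20))              # batch of event-type sequences
    # Two stochastic "views" of each sequence (here: random event masking).
    view1 = seq.masked_fill(torch.rand(seq.shape) < 0.1, 0)
    view2 = seq.masked_fill(torch.rand(seq.shape) < 0.1, 0)
    loss = nt_xent_loss(enc(view1), enc(view2))
    print(f"contrastive loss: {loss.item():.4f}")
```

In a two-stage pipeline of the kind the abstract describes, a loss of this form would be used to shape the hidden representations before (or alongside) the Hawkes-process likelihood used for event prediction.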