Learning Dynamic Graph Embeddings with Neural Controlled Differential Equations.

  • Abstract
  • Similar Papers
Abstract

This paper focuses on representation learning for dynamic graphs with temporal interactions. A fundamental issue is that both the graph structure and the nodes have their own dynamics, and their blending induces intractable complexity in the temporal evolution over graphs. Drawing inspiration from recent progress on physical dynamic models in deep neural networks, we propose Graph Neural Controlled Differential Equations (GN-CDEs), a continuous-time framework that jointly models node embeddings and structural dynamics by incorporating a graph-enhanced neural network vector field with a time-varying graph path as the control signal. Our framework exhibits several desirable characteristics, including the ability to express dynamics on evolving graphs without piecewise integration, the capability to calibrate trajectories with subsequent data, and robustness to missing observations. Empirical evaluation on a range of dynamic graph representation learning tasks demonstrates the effectiveness of our proposed approach in capturing the complex dynamics of dynamic graphs.
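The controlled-differential-equation formulation in the abstract can be illustrated with a minimal sketch. The toy example below (an illustration under stated assumptions, not the authors' GN-CDE implementation) Euler-integrates dz(t) = f_θ(z(t)) dX(t), where the control path X(t) linearly interpolates between two graph adjacency snapshots; the names `vector_field` and `control_derivative` are hypothetical.

```python
import numpy as np

# Toy sketch of a graph-controlled differential equation (assumption,
# not the paper's implementation): dz(t) = f_theta(z(t)) dX(t), where
# the control path X(t) linearly interpolates adjacency snapshots.

rng = np.random.default_rng(0)
n_nodes, dim = 4, 3

# Two adjacency snapshots observed at t=0 and t=1; the control path
# interpolates linearly between them, so dX/dt is constant here.
A0 = np.eye(n_nodes)
A1 = np.ones((n_nodes, n_nodes)) / n_nodes

def control_derivative(t):
    # dX/dt for linear interpolation between A0 and A1.
    return A1 - A0

W = rng.normal(scale=0.1, size=(dim, dim))

def vector_field(z):
    # f_theta(z): a toy stand-in for the learned graph-enhanced
    # neural network vector field described in the abstract.
    return np.tanh(z @ W)

# Euler integration of the CDE along the graph path.
z = rng.normal(size=(n_nodes, dim))   # initial node embeddings
t, dt = 0.0, 0.01
while t < 1.0:
    dX = control_derivative(t) * dt   # (n_nodes, n_nodes) control increment
    z = z + dX @ vector_field(z)      # propagate states along the graph path
    t += dt

print(z.shape)  # node embeddings at t=1
```

Because the graph enters only through the control increment dX, the same integration loop runs over an evolving graph without restarting at each snapshot, which mirrors the "no piecewise integration" property claimed above.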

Similar Papers
  • Research Article
  • Cited by: 8
  • 10.1186/s40537-024-00918-5
Enhancing academic performance prediction with temporal graph networks for massive open online courses
  • Apr 13, 2024
  • Journal of Big Data
  • Qionghao Huang + 1 more

Educational big data significantly impacts education, and Massive Open Online Courses (MOOCs), a crucial learning approach, have evolved to be more intelligent with these technologies. Deep neural networks have significantly advanced a crucial task within MOOCs: predicting student academic performance. However, most deep learning-based methods usually ignore the temporal information and interaction behaviors during learning activities, information that could effectively enhance a model's predictive accuracy. To tackle this, we formulate the learning processes of e-learning students as dynamic temporal graphs to encode the temporal information and interaction behaviors during their studying. We propose a novel academic performance prediction model (APP-TGN) based on temporal graph neural networks. Specifically, in APP-TGN, a dynamic graph is constructed from online learning activity logs. A temporal graph network with low-high filters learns potential academic performance variations encoded in dynamic graphs. Furthermore, a global sampling module is developed to mitigate the problem of false correlations in deep learning-based models. Finally, multi-head attention is utilized for predicting academic outcomes. Extensive experiments are conducted on a well-known public dataset. The experimental results indicate that APP-TGN significantly surpasses existing methods and demonstrates excellent potential in automated feedback and personalized learning.

  • Research Article
  • Cited by: 24
  • 10.1287/ijoc.2022.1172
Inductive Representation Learning on Dynamic Stock Co-Movement Graphs for Stock Predictions
  • Mar 1, 2022
  • INFORMS Journal on Computing
  • Hu Tian + 4 more

Co-movement among individual firms’ stock prices can reflect complex interfirm relationships. This paper proposes a novel method to leverage such relationships for stock price predictions by adopting inductive graph representation learning on dynamic stock graphs constructed based on historical stock price co-movement. To learn node representations from such dynamic graphs for better stock predictions, we propose the hybrid-attention dynamic graph neural network, an inductive graph representation learning method. We also extended mini-batch gradient descent to inductive representation learning on dynamic stock graphs so that the model can update parameters over mini-batch stock graphs with higher training efficiency. Extensive experiments on stocks from different markets and trading simulations demonstrate that the proposed method significantly improves stock predictions. The proposed method can have important implications for the management of financial portfolios and investment risk. Summary of Contribution: Accurate predictions of stock prices have important implications for financial decisions. In today’s economy, individual firms are increasingly connected via different types of relationships. As a result, firms’ stock prices often feature synchronous co-movement patterns. This paper represents the first effort to leverage such phenomena to construct dynamic stock graphs for stock predictions. We develop hybrid-attention dynamic graph neural network (HAD-GNN), an inductive graph representation learning framework for dynamic stock graphs to incorporate temporal and graph attention mechanisms. To improve the learning efficiency of HAD-GNN, we also extend the mini-batch gradient descent to inductive representation learning on such dynamic graphs and adopt a t-batch training mechanism (t-BTM). We demonstrate the effectiveness of our new approach via experiments based on real-world data and simulations.

  • Conference Article
  • Cited by: 44
  • 10.1145/3340531.3411946
Continuous-Time Dynamic Graph Learning via Neural Interaction Processes
  • Oct 19, 2020
  • Xiaofu Chang + 6 more

Dynamic graphs such as user-item interaction graphs and financial transaction networks are ubiquitous nowadays. While numerous representation learning methods for static graphs have been proposed, the study of dynamic graphs is still in its infancy. A main challenge of modeling dynamic graphs is how to effectively encode temporal and structural information into nonlinear and compact dynamic embeddings. To achieve this, we propose a principled graph-neural-based approach to learn continuous-time dynamic embeddings. We first define a temporal dependency interaction graph (TDIG) that is induced from sequences of interaction data. Based on the topology of this TDIG, we develop a dynamic message passing neural network named TDIG-MPNN, which can capture the fine-grained global and local information on the TDIG. In addition, to enhance the quality of continuous-time dynamic embeddings, a novel selection mechanism comprised of two successive steps, i.e., co-attention and gating, is applied before the above TDIG-MPNN layer to adjust the importance of the nodes by considering the high-order correlation between interactive nodes' k-depth neighbors on the TDIG. Finally, we cast our learning problem in the framework of temporal point processes (TPPs), where we use TDIG-MPNN to design a neural intensity function for the dynamic interaction processes. Our model achieves superior performance over alternatives on temporal interaction prediction (including transductive and inductive tasks) on multiple datasets.
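The temporal point process idea in that abstract can be sketched in a few lines. The toy below is a hedged illustration (not the paper's TDIG-MPNN intensity): it assumes a conditional intensity λ = softplus(w · [h_u; h_v] + b) over the concatenated embeddings of the two interacting nodes, with softplus guaranteeing positivity; `intensity` and all parameter values are hypothetical.

```python
import math

# Toy neural conditional intensity for an interaction event (u, v):
# lambda = softplus(w . [h_u ; h_v] + b). This is an illustrative
# stand-in, not the paper's TDIG-MPNN-based intensity function.

def softplus(x):
    # Smooth, always-positive rectifier: log(1 + e^x).
    return math.log1p(math.exp(x))

def intensity(h_u, h_v, w, b):
    # Score the concatenated node embeddings, then map to (0, inf).
    s = sum(wi * hi for wi, hi in zip(w, h_u + h_v)) + b
    return softplus(s)

h_u, h_v = [0.2, -0.1], [0.5, 0.3]   # hypothetical node embeddings
w, b = [1.0, -0.5, 0.25, 0.75], 0.1  # hypothetical learned parameters
lam = intensity(h_u, h_v, w, b)
print(lam > 0)  # prints True: the intensity is positive by construction
```

A higher λ means the model considers an interaction between u and v more likely at that instant; training would maximize the TPP likelihood of observed interaction times under this intensity.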

  • Research Article
  • Cited by: 2
  • 10.1016/j.jnlssr.2024.05.005
DyHDGE: Dynamic heterogeneous transaction graph embedding for safety-centric fraud detection in financial scenarios
  • Jul 22, 2024
  • Journal of Safety Science and Resilience
  • Xinzhi Wang + 3 more

  • Research Article
  • 10.1609/aaai.v36i11.21682
Learning to Evolve on Dynamic Graphs (Student Abstract)
  • Jun 28, 2022
  • Proceedings of the AAAI Conference on Artificial Intelligence
  • Xintao Xiang + 2 more

Representation learning in dynamic graphs is a challenging problem because the topology of the graph and the node features vary at different times. This requires the model to effectively capture both graph topology information and temporal information. Most existing works are built on recurrent neural networks (RNNs), which are used to extract temporal information from dynamic graphs, and thus they inherit the drawbacks of RNNs. In this paper, we propose Learning to Evolve on Dynamic Graphs (LEDG), a novel algorithm that jointly learns graph information and time information. Specifically, our approach utilizes gradient-based meta-learning to learn updating strategies that have better generalization ability than RNNs on snapshots. It is model-agnostic and thus can train any message-passing-based graph neural network (GNN) on dynamic graphs. To enhance the representation power, we disentangle the embeddings into time embeddings and graph intrinsic embeddings. We conduct experiments on various datasets and downstream tasks, and the experimental results validate the effectiveness of our method.

  • Research Article
  • Cited by: 1
  • 10.4314/jasem.v27i11.35
Application of Deep Neural Network-Artificial Neural Network Model for Prediction Of Dew Point Pressure in Gas Condensate Reservoirs from Field-X in the Niger Delta Region Nigeria
  • Nov 28, 2023
  • Journal of Applied Sciences and Environmental Management
  • P U Abeshi + 4 more

Reservoirs of natural gas and gas condensate have been proposed as a potential means of providing affordable and cleaner energy for a growing global population and expanding industrialization. This work evaluates reservoir simulation for production optimization using a deep neural network - artificial neural network (DNN-ANN) model to predict the dew point pressure in gas condensate reservoirs from Field-X in the Niger Delta Region of Nigeria. The dew-point pressure (DPP) of gas condensate reservoirs was estimated as a function of gas composition, reservoir temperature, and the molecular weight and specific gravity of the heptane-plus fraction. Results obtained show that the mean relative error (MRE) and R-squared (R2) are 3.35% and 0.99965, respectively, indicating that the model is excellent at predicting DPP values. The DNN-ANN model is also evaluated in comparison to earlier models created by previous authors. It is recommended that the DNN-ANN model developed in this study could be applied to reservoir simulation and modeling, well performance analysis, reservoir engineering problems, and production optimization.

  • Research Article
  • Cited by: 5
  • 10.1016/j.neunet.2023.11.060
Black-box attacks on dynamic graphs via adversarial topology perturbations
  • Dec 1, 2023
  • Neural Networks
  • Haicheng Tao + 5 more

  • Research Article
  • Cited by: 17
  • 10.1016/j.knosys.2021.107453
FILDNE: A Framework for Incremental Learning of Dynamic Networks Embeddings
  • Aug 30, 2021
  • Knowledge-Based Systems
  • Piotr Bielak + 4 more

  • Research Article
  • Cited by: 5
  • 10.1016/j.knosys.2024.111952
Backbone-based Dynamic Spatio-Temporal Graph Neural Network for epidemic forecasting
  • May 16, 2024
  • Knowledge-Based Systems
  • Junkai Mao + 3 more

  • Conference Article
  • Cited by: 3
  • 10.1109/icassp49357.2023.10094834
Dynamic Fair Node Representation Learning
  • Jun 4, 2023
  • Oyku Deniz Kose + 1 more

Many real-world networks, such as social and traffic networks, vary over time and can be modeled as dynamic graphs. Despite the significant number of systems that could benefit from algorithmic tools over dynamic graphs, dynamic graph representation learning is an under-explored research area. Furthermore, while the fairness of algorithms is essential for their deployment in real-world systems, this issue has never been considered in the context of dynamic graphs, to the best of our knowledge. Motivated by this, the present study proposes an efficient online node representation learning framework over dynamic graphs that can also mitigate bias. Specifically, the proposed technique combines different observations (graph structure and nodal attributes) of the same source (an attributed graph) in a complementary way while also reducing the intrinsic bias in the learned representations. Experimental results on dynamic graphs show that the proposed online strategy can improve group fairness measures for node classification together with comparable or better utility relative to the baselines.

  • Research Article
  • Cited by: 26
  • 10.1016/j.jestch.2018.08.010
Deep neural network model for group activity recognition using contextual relationship
  • Sep 7, 2018
  • Engineering Science and Technology, an International Journal
  • S.A Vahora + 1 more

  • Conference Article
  • Cited by: 62
  • 10.1145/3308560.3316581
Learning to Represent the Evolution of Dynamic Graphs with Recurrent Models
  • May 13, 2019
  • Aynaz Taheri + 2 more

Graph representation learning for static graphs is a well studied topic. Recently, a few studies have focused on learning temporal information in addition to the topology of a graph. Most of these studies have relied on learning to represent nodes and substructures in dynamic graphs. However, the representation learning problem for entire graphs in a dynamic context is yet to be addressed. In this paper, we propose an unsupervised representation learning architecture for dynamic graphs, designed to learn both the topological and temporal features of the graphs that evolve over time. The approach consists of a sequence-to-sequence encoder-decoder model embedded with gated graph neural networks (GGNNs) and long short-term memory networks (LSTMs). The GGNN is able to learn the topology of the graph at each time step, while LSTMs are leveraged to propagate the temporal information among the time steps. Moreover, an encoder learns the temporal dynamics of an evolving graph and a decoder reconstructs the dynamics over the same period of time using the encoded representation provided by the encoder. We demonstrate that our approach is capable of learning the representation of a dynamic graph through time by applying the embeddings to dynamic graph classification using a real world dataset of animal behaviour.

  • Research Article
  • Cited by: 19
  • 10.1016/j.neucom.2022.01.064
A unified structure learning framework for graph attention networks
  • Jan 22, 2022
  • Neurocomputing
  • Jinliang Yuan + 5 more

  • Book Chapter
  • Cited by: 1
  • 10.1007/978-3-031-20865-2_32
Dynamic-GTN: Learning an Node Efficient Embedding in Dynamic Graph with Transformer
  • Jan 1, 2022
  • Thi-Linh Hoang + 1 more

Graph Transformer Networks (GTNs) use an attention mechanism to learn node representations in a static graph and achieve state-of-the-art results on several graph learning tasks. However, due to the computational complexity of the attention operation, GTNs are not applicable to dynamic graphs. In this paper, we propose the Dynamic-GTN model, which is designed to learn node embeddings in a continuous-time dynamic graph. The Dynamic-GTN extends the attention mechanism in a standard GTN to include temporal information about recent node interactions. Based on temporal patterns of interaction between nodes, the Dynamic-GTN employs a node sampling step to reduce the number of attention operations in the dynamic graph. We evaluate our model on three benchmark datasets for learning node embeddings in dynamic graphs. The results show that the Dynamic-GTN has better accuracy than state-of-the-art Graph Neural Networks on both transductive and inductive graph learning tasks. Keywords: Graph Transformer Network, Dynamic graph, Node sampling

  • Research Article
  • Cited by: 1
  • 10.1016/j.eswa.2024.124201
A novel robust black-box fingerprinting scheme for deep classification neural networks
  • May 14, 2024
  • Expert Systems With Applications
  • Mouke Mo + 5 more

More from: IEEE transactions on pattern analysis and machine intelligence
  • Research Article
  • 10.1109/tpami.2025.3630635
Towards Visual Grounding: A Survey.
  • Nov 7, 2025
  • IEEE transactions on pattern analysis and machine intelligence
  • Linhui Xiao + 4 more

  • Research Article
  • 10.1109/tpami.2025.3630339
DELTA: Deep Low-Rank Tensor Representation for Multi-Dimensional Data Recovery.
  • Nov 7, 2025
  • IEEE transactions on pattern analysis and machine intelligence
  • Guo-Wei Yang + 4 more

  • Research Article
  • 10.1109/tpami.2025.3630577
Variational Bayesian Semi-supervised Keyword Extraction.
  • Nov 7, 2025
  • IEEE transactions on pattern analysis and machine intelligence
  • Yaofang Hu + 3 more

  • Research Article
  • 10.1109/tpami.2025.3630505
Large-scale Logo Detection.
  • Nov 7, 2025
  • IEEE transactions on pattern analysis and machine intelligence
  • Sujuan Hou + 6 more

  • Research Article
  • 10.1109/tpami.2025.3630673
A Survey of Graph Neural Networks in Real World: Imbalance, Noise, Privacy and OOD Challenges.
  • Nov 7, 2025
  • IEEE transactions on pattern analysis and machine intelligence
  • Wei Ju + 12 more

  • Research Article
  • 10.1109/tpami.2025.3630317
Large-Scale Omnidirectional Person Positioning.
  • Nov 7, 2025
  • IEEE transactions on pattern analysis and machine intelligence
  • Lu Yang + 5 more

  • Research Article
  • 10.1109/tpami.2025.3630242
SPAN: Learning Similarity between Scene Graphs and Images with Transformers.
  • Nov 7, 2025
  • IEEE transactions on pattern analysis and machine intelligence
  • Yuren Cong + 3 more

  • Research Article
  • 10.1109/tpami.2025.3630185
Sparse-PGD: A Unified Framework for Sparse Adversarial Perturbations Generation.
  • Nov 7, 2025
  • IEEE transactions on pattern analysis and machine intelligence
  • Xuyang Zhong + 1 more

  • Research Article
  • 10.1109/tpami.2025.3630605
Graph Quality Matters on Revealing the Semantics behind the Data in Physical World.
  • Nov 7, 2025
  • IEEE transactions on pattern analysis and machine intelligence
  • Jielong Yan + 3 more

  • Research Article
  • 10.1109/tpami.2025.3630209
Dynamic Bit-Wise Semantic Transformer Hashing for Multi-Modal Retrieval.
  • Nov 7, 2025
  • IEEE transactions on pattern analysis and machine intelligence
  • Wentao Tan + 6 more
