Online Graph Nets

29 Sep 2021 · Hojin Kang, Jou-Hui Ho, Diego Mesquita, Jorge Pérez, Amauri H. Souza

Temporal graph neural networks (T-GNNs) sequentially update node states and use temporal message passing to predict events in continuous-time dynamic graphs. While node states are kept in memory, the message-passing operations must be computed on demand for each prediction. In practice, these operations are the computational bottleneck of state-of-the-art T-GNNs, as they require topologically exploring large temporal graphs. To circumvent this bottleneck, we propose Online Graph Nets (OGNs). To avoid temporal message passing, OGN maintains a summary of the temporal neighbors of each node in a latent variable and updates it as events unroll, in an online fashion. At prediction time, OGN simply combines node states and their latents to obtain node-level representations. Consequently, the memory cost of OGN is constant with respect to the number of previous events. Remarkably, OGN outperforms most existing T-GNNs on temporal link prediction benchmarks while running orders of magnitude faster. For instance, OGN performs similarly to the best-known T-GNN on Reddit, with a $374\times$ speedup. Also, since OGNs do not explore temporal graphs at prediction time, they are well suited for on-device prediction (e.g., on mobile phones).
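To make the mechanism described in the abstract concrete, the sketch below keeps a per-node state and a per-node latent summary, folds each incoming event into the endpoints' latents with a recurrent update, and combines state and latent at prediction time. This is a minimal illustration, not the authors' implementation: the GRU update, the MLP combiner, the dimensions, and the names `observe`/`represent` are all assumptions.

```python
# Minimal sketch of the idea in the abstract -- NOT the authors' implementation.
# Per-node latent summaries are updated online as events arrive; prediction
# combines a node's state with its latent, with no temporal-graph traversal.
# The GRUCell update, MLP combiner, dimensions, and method names are assumptions.
import torch
import torch.nn as nn


class OnlineGraphNetSketch(nn.Module):
    def __init__(self, num_nodes: int, state_dim: int = 32, event_dim: int = 8):
        super().__init__()
        # Persistent node states and latent neighbor summaries: constant
        # memory per node, independent of how many events have been seen.
        self.register_buffer("state", torch.zeros(num_nodes, state_dim))
        self.register_buffer("latent", torch.zeros(num_nodes, state_dim))
        # Hypothetical online update: fold one event into a latent summary.
        self.update = nn.GRUCell(state_dim + event_dim, state_dim)
        # Hypothetical combiner producing the node-level representation.
        self.combine = nn.Sequential(
            nn.Linear(2 * state_dim, state_dim),
            nn.ReLU(),
            nn.Linear(state_dim, state_dim),
        )

    @torch.no_grad()
    def observe(self, src: int, dst: int, feat: torch.Tensor) -> None:
        # Online step: each endpoint's latent absorbs the other endpoint's
        # state plus the event features. No temporal neighborhood is explored.
        x_src = torch.cat([self.state[dst], feat]).unsqueeze(0)
        x_dst = torch.cat([self.state[src], feat]).unsqueeze(0)
        self.latent[src] = self.update(x_src, self.latent[src].unsqueeze(0)).squeeze(0)
        self.latent[dst] = self.update(x_dst, self.latent[dst].unsqueeze(0)).squeeze(0)

    def represent(self, node: int) -> torch.Tensor:
        # Prediction time: combine state and latent; O(1) in past events.
        return self.combine(torch.cat([self.state[node], self.latent[node]]))


# Toy usage: stream two events, then read off a node representation.
ogn = OnlineGraphNetSketch(num_nodes=100)
ogn.observe(3, 7, torch.randn(8))
ogn.observe(7, 42, torch.randn(8))
z = ogn.represent(7)  # shape: (32,)
```

Under these assumptions, the constant per-node footprint and the O(1) prediction step mirror the abstract's claims: all event-dependent work happens in the online update, and nothing at prediction time scales with the number of previous events.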
