Diffusing Graph Attention

1 Mar 2023 · Daniel Glickman, Eran Yahav

The dominant paradigm for machine learning on graphs uses Message Passing Graph Neural Networks (MP-GNNs), in which node representations are updated by aggregating information from their local neighborhood. Recently, there have been increasingly many attempts to adapt the Transformer architecture to graphs in an effort to solve some known limitations of MP-GNNs. A challenging aspect of designing Graph Transformers is integrating the arbitrary graph structure into the architecture. We propose Graph Diffuser (GD) to address this challenge. GD learns to extract structural and positional relationships between distant nodes in the graph, which it then uses to direct the Transformer's attention and node representations. We demonstrate that existing GNNs and Graph Transformers struggle to capture long-range interactions, and show how Graph Diffuser captures them while admitting intuitive visualizations. Experiments on eight benchmarks show Graph Diffuser to be a highly competitive model, outperforming the state-of-the-art in a diverse set of domains.
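As a rough illustration of the idea described above, the sketch below biases standard self-attention logits with a learned, truncated diffusion series over the row-normalized adjacency matrix, so that attention between distant nodes is steered by multi-hop graph structure. This is a minimal sketch under stated assumptions: the class name `DiffusionBiasedAttention`, the `num_hops` parameter, and the single-head, dense-adjacency setup are illustrative choices, not the paper's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DiffusionBiasedAttention(nn.Module):
    """Single-head self-attention whose logits are biased by a learned
    graph-diffusion operator (an illustrative sketch, not the paper's layer)."""

    def __init__(self, dim: int, num_hops: int = 4):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Learned mixing weights over powers of the normalized adjacency:
        # theta_0 * I + theta_1 * A_hat + ... + theta_K * A_hat^K
        self.theta = nn.Parameter(torch.randn(num_hops + 1))
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (n, dim) node features; adj: (n, n) dense adjacency matrix.
        n = adj.size(0)
        deg = adj.sum(-1).clamp(min=1.0)          # avoid division by zero
        a_hat = adj / deg.unsqueeze(-1)           # row-normalized (random-walk) adjacency
        # Accumulate a truncated diffusion series with learned coefficients.
        power = torch.eye(n, device=adj.device)
        diffusion = self.theta[0] * power
        for k in range(1, self.theta.numel()):
            power = power @ a_hat                 # k-hop transition probabilities
            diffusion = diffusion + self.theta[k] * power
        # Bias the attention logits with the diffusion scores.
        logits = (self.q(x) @ self.k(x).t()) * self.scale + diffusion
        attn = F.softmax(logits, dim=-1)
        return attn @ self.v(x)


# Usage: six nodes with 16-dim features and a random sparse adjacency.
n, dim = 6, 16
x = torch.randn(n, dim)
adj = (torch.rand(n, n) < 0.3).float()
out = DiffusionBiasedAttention(dim)(x, adj)       # shape: (6, 16)
```

Adding the diffusion term directly to the logits (rather than masking attention to local neighborhoods) is what lets distant but structurally related nodes attend to one another, which is the behavior the abstract attributes to GD.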


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Link Prediction | PCQM-Contact | Graph Diffuser | Hits@1 | 0.1369 ± 0.0012 | #2 |
| Link Prediction | PCQM-Contact | Graph Diffuser | Hits@3 | 0.4053 ± 0.0011 | #2 |
| Link Prediction | PCQM-Contact | Graph Diffuser | Hits@10 | 0.8592 ± 0.0007 | #2 |
| Link Prediction | PCQM-Contact | Graph Diffuser | MRR | 0.3388 ± 0.0011 | #8 |
| Graph Classification | Peptides-func | Graph Diffuser | AP | 0.6651 ± 0.0010 | #11 |
| Graph Regression | Peptides-struct | Graph Diffuser | MAE | 0.2461 ± 0.0010 | #5 |
