Topology-Informed Graph Transformer

3 Feb 2024  ·  Yun Young Choi, Sun Woo Park, Minho Lee, Youngho Woo ·

Transformers have revolutionized performance in Natural Language Processing and Vision, paving the way for their integration with Graph Neural Networks (GNNs). One key challenge in enhancing graph transformers is strengthening their discriminative power in distinguishing isomorphism classes of graphs, which plays a crucial role in boosting predictive performance. To address this challenge, we introduce the 'Topology-Informed Graph Transformer (TIGT)', a novel transformer that enhances both the discriminative power in detecting graph isomorphisms and the overall performance of Graph Transformers. TIGT consists of four components: a topological positional embedding layer that uses non-isomorphic universal covers, based on cyclic subgraphs of graphs, to ensure unique graph representations; a dual-path message-passing layer that explicitly encodes topological characteristics throughout the encoder layers; a global attention mechanism; and a graph information layer that recalibrates channel-wise graph features for better feature representation. TIGT outperforms previous Graph Transformers in classifying a synthetic dataset aimed at distinguishing isomorphism classes of graphs. Additionally, mathematical analysis and empirical evaluations highlight our model's competitive edge over state-of-the-art Graph Transformers across various benchmark datasets.
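The four components above can be sketched as a minimal, self-contained numpy prototype. This is an illustrative assumption-laden sketch, not the paper's implementation: the positional embedding is approximated here by per-node closed-walk counts (a cheap proxy for the paper's universal covers over cyclic subgraphs), the cyclic-subgraph adjacency `C` is assumed precomputed, and all weight shapes and the `tigt_layer` composition are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cycle_positional_embedding(A, d):
    # Hypothetical proxy for TIGT's topological positional embedding:
    # per-node closed-walk counts of lengths 2..d+1, standing in for the
    # paper's non-isomorphic universal covers over cyclic subgraphs.
    P = A.astype(float)
    feats = []
    for _ in range(d):
        P = P @ A
        feats.append(np.diag(P))  # closed walks through each node
    return np.stack(feats, axis=1)

def dual_path_mp(H, A, C, W_local, W_topo):
    # Dual-path message passing: one path over the original adjacency A,
    # one over a cyclic-subgraph adjacency C (assumed precomputed).
    return np.tanh(A @ H @ W_local + C @ H @ W_topo)

def global_attention(H, Wq, Wk, Wv):
    # Single-head global self-attention over all nodes of the graph.
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    return softmax(Q @ K.T / np.sqrt(K.shape[1])) @ V

def graph_info_layer(H, W1, W2):
    # Channel-wise recalibration (squeeze-and-excitation style):
    # mean-pool over nodes, compute per-channel gates, rescale features.
    z = np.maximum(H.mean(axis=0) @ W1, 0.0)
    gate = 1.0 / (1.0 + np.exp(-(z @ W2)))
    return H * gate

def tigt_layer(X, A, C, d_model):
    # One sketched encoder layer chaining the four components; weights
    # are random because this is a shape/flow illustration only.
    pe = cycle_positional_embedding(A, d_model - X.shape[1])
    H = np.concatenate([X, pe], axis=1)  # attach topological positions
    Ws = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(7)]
    H = dual_path_mp(H, A, C, Ws[0], Ws[1])
    H = H + global_attention(H, Ws[2], Ws[3], Ws[4])  # residual attention
    return graph_info_layer(H, Ws[5], Ws[6])

# Demo on a 6-cycle: for a pure cycle graph the cyclic subgraph is the
# whole graph, so C = A here.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
X = rng.standard_normal((n, 4))
out = tigt_layer(X, A, C=A, d_model=8)
print(out.shape)  # one embedding per node, width d_model
```

The dual path keeps local topological signal flowing through every encoder layer, while the final gating step rescales channels using a whole-graph summary.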


Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
| --- | --- | --- | --- | --- | --- |
| Graph Classification | CIFAR10 100k | TIGT | Accuracy (%) | 73.955 | # 3 |
| Node Classification | CLUSTER | TIGT | Accuracy | 78.033 | # 5 |
| Graph Classification | MNIST | TIGT | Accuracy | 98.230±0.133 | # 2 |
| Node Classification | PATTERN | TIGT | Accuracy | 86.680 | # 6 |
| Graph Regression | PCQM4Mv2-LSC | TIGT | Validation MAE | 0.0826 | # 6 |
| Graph Classification | Peptides-func | TIGT | AP | 0.6679 | # 10 |
| Graph Regression | Peptides-struct | TIGT | MAE | 0.2485 | # 10 |
| Graph Regression | ZINC | TIGT | MAE | 0.057 | # 1 |
| Graph Regression | ZINC-full | TIGT | MAE | 0.014 | # 1 |
| Graph Regression | ZINC-full | TIGT | Test MAE | 0.014 | # 1 |
