Edge Contraction Pooling for Graph Neural Networks

27 May 2019 · Frederik Diehl

Graph Neural Network (GNN) research has concentrated on improving convolutional layers, with little attention paid to developing graph pooling layers. Yet pooling layers can enable GNNs to reason over abstracted groups of nodes instead of single nodes. To close this gap, we propose a graph pooling layer relying on the notion of edge contraction: EdgePool learns a localized and sparse hard pooling transform. We show that EdgePool outperforms alternative pooling methods, can be easily integrated into most GNN models, and improves performance on both node and graph classification.
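To make the edge-contraction idea concrete, here is a minimal NumPy sketch of a hard pooling step in the spirit of EdgePool: each edge gets a learned score, the highest-scoring edges are greedily contracted so that each node is merged at most once, and merged node features are gated by the edge score. The function name `edge_pool`, the linear scoring weights `w`, and the gating scheme are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def edge_pool(x, edges, w, b=0.0):
    """Sketch of edge-contraction pooling (hypothetical minimal version).

    x:     (N, F) node feature matrix
    edges: list of (i, j) undirected edges
    w:     (2F,) assumed learned scoring weights
    Returns pooled features, re-wired edges, and a node->cluster map.
    """
    # 1. Score each edge from the concatenated endpoint features.
    scores = [float(np.dot(w, np.concatenate([x[i], x[j]])) + b)
              for (i, j) in edges]

    # 2. Greedily contract edges by descending score; each node may
    #    take part in at most one contraction (a "hard" pooling choice).
    order = np.argsort(scores)[::-1]
    cluster = -np.ones(len(x), dtype=int)
    next_id = 0
    gates = {}
    for k in order:
        i, j = edges[k]
        if cluster[i] == -1 and cluster[j] == -1:
            cluster[i] = cluster[j] = next_id
            # Sigmoid gate on the score keeps gradients flowing
            # through the pooling decision (an assumption here).
            gates[next_id] = 1.0 / (1.0 + np.exp(-scores[k]))
            next_id += 1
    # Unmatched nodes survive as singleton clusters with gate 1.
    for n in range(len(x)):
        if cluster[n] == -1:
            cluster[n] = next_id
            gates[next_id] = 1.0
            next_id += 1

    # 3. New node features: gated sum over each merged pair.
    x_new = np.zeros((next_id, x.shape[1]))
    for n in range(len(x)):
        x_new[cluster[n]] += gates[cluster[n]] * x[n]

    # 4. Re-wire edges between the new clusters, dropping self-loops.
    new_edges = {(min(cluster[i], cluster[j]), max(cluster[i], cluster[j]))
                 for (i, j) in edges if cluster[i] != cluster[j]}
    return x_new, sorted(new_edges), cluster
```

Because every contraction merges exactly two nodes, one such layer roughly halves the graph size, which is what lets a GNN stacked on top reason over abstracted node groups rather than single nodes.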


Results from the Paper


| Task                 | Dataset  | Model                 | Metric   | Value | Global Rank |
|----------------------|----------|-----------------------|----------|-------|-------------|
| Graph Classification | PROTEINS | EdgePool w/ GraphSAGE | Accuracy | 73.5% | # 74        |
| Graph Classification | PROTEINS | EdgePool              | Accuracy | 72.5% | # 80        |
