Optimization of Graph Neural Networks with Natural Gradient Descent

21 Aug 2020 · Mohammad Rasool Izadi, Yihao Fang, Robert Stevenson, Lizhen Lin

In this work, we propose to employ information-geometric tools to optimize graph neural network architectures such as graph convolutional networks. More specifically, we develop optimization algorithms for graph-based semi-supervised learning by employing natural gradient information in the optimization process. This allows us to efficiently exploit the geometry of the underlying statistical model or parameter space for optimization and inference. To the best of our knowledge, this is the first work that utilizes the natural gradient for the optimization of graph neural networks, and it can be extended to other semi-supervised problems. Efficient computational algorithms are developed, and extensive numerical studies are conducted to demonstrate the superior performance of our algorithms over existing optimizers such as Adam and SGD.
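
For intuition, the core idea behind natural gradient descent is to precondition the ordinary gradient with the inverse Fisher information matrix, updating parameters as θ ← θ − η F⁻¹∇L, so that steps respect the geometry of the statistical model rather than the raw parameter space. Below is a minimal sketch of that update on a toy logistic-regression classifier; the model, data, empirical-Fisher approximation, and damping term are illustrative assumptions, not the paper's actual implementation on graph convolutional networks.

```python
# Minimal natural-gradient-descent sketch (assumed toy example, not the
# paper's method): the empirical Fisher matrix is estimated from
# per-sample gradients and used to precondition the update.
import numpy as np

rng = np.random.default_rng(0)

# Toy "node classification" data: X holds features, y binary labels.
X = rng.normal(size=(100, 5))
true_w = rng.normal(size=5)
y = (X @ true_w + 0.1 * rng.normal(size=100) > 0).astype(float)

w = np.zeros(5)
lr, damping = 0.5, 1e-2  # damping keeps the Fisher matrix invertible

for step in range(50):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))        # predicted probabilities
    per_sample_grads = (p - y)[:, None] * X   # per-sample NLL gradients
    grad = per_sample_grads.mean(axis=0)      # ordinary gradient
    # Empirical Fisher: average outer product of per-sample gradients.
    fisher = per_sample_grads.T @ per_sample_grads / len(y) + damping * np.eye(5)
    # Natural gradient step: solve F d = grad instead of inverting F.
    w -= lr * np.linalg.solve(fisher, grad)
```

In practice the Fisher matrix is too large to form explicitly for a neural network, so scalable approximations (e.g., block-diagonal or Kronecker-factored structures) are used in place of the dense solve shown here.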


Results from the Paper


| Task | Dataset | Model | Accuracy (%) | Global Rank |
|---|---|---|---|---|
| Node Classification | Citeseer | SSP | 80.52 ± 0.14 | #7 |
| Node Classification | CiteSeer with Public Split: fixed 20 nodes per class | SSP | 74.28 ± 0.67 | #8 |
| Node Classification | Cora | SSP | 90.16 ± 0.59 | #1 |
| Node Classification | Cora with Public Split: fixed 20 nodes per class | SSP | 82.84 ± 0.87 | #23 |
| Node Classification | Pubmed | SSP | 89.36 ± 0.57 | #11 |
| Node Classification | PubMed with Public Split: fixed 20 nodes per class | SSP | 80.06 ± 0.34 | #15 |
