Rethinking Kernel Methods for Node Representation Learning on Graphs

Graph kernels are kernel methods that measure graph similarity and serve as a standard tool for graph classification. However, the use of kernel methods for node classification, a closely related problem in graph representation learning, remains ill-posed, and the state-of-the-art methods rely heavily on heuristics. Here, we present a novel theoretical kernel-based framework for node classification that bridges the gap between these two representation learning problems on graphs. Our approach is motivated by graph kernel methodology but extended to learn node representations that capture the structural information in a graph. We theoretically show that our formulation is as powerful as any positive semidefinite kernel. To efficiently learn the kernel, we propose a novel mechanism for node feature aggregation and a data-driven similarity metric employed during the training phase. More importantly, our framework is flexible and complementary to other graph-based deep learning models, e.g., Graph Convolutional Networks (GCNs). We empirically evaluate our approach on a number of standard node classification benchmarks and demonstrate that our model sets a new state of the art.
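The abstract describes two ingredients: aggregating node features over the graph structure and learning a data-driven, positive semidefinite similarity between nodes. The sketch below is not the authors' code; it is a minimal NumPy illustration of that general idea, with hypothetical names (aggregate, psd_similarity, W) and a random projection standing in for parameters that would be learned during training.

```python
# Minimal sketch (not the paper's implementation): one step of neighbor feature
# aggregation followed by a learned similarity that is PSD by construction.
import numpy as np

def aggregate(A, X):
    """Average each node's features with its neighbors' (one aggregation step)."""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)   # node degrees
    return (A_hat @ X) / deg                 # degree-normalized aggregation

def psd_similarity(H, W):
    """Data-driven similarity k(u, v) = (W h_u)^T (W h_v); PSD because it is a Gram matrix."""
    Z = H @ W.T                              # learned projection of node representations
    return Z @ Z.T                           # node-node kernel (similarity) matrix

# Toy graph: 4 nodes with 3-dimensional features
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 3))

H = aggregate(A, X)                                 # structure-aware node representations
W = np.random.default_rng(1).normal(size=(2, 3))    # stand-in for a learned projection
K = psd_similarity(H, W)                            # 4 x 4 kernel matrix over nodes
print(K.shape, bool(np.all(np.linalg.eigvalsh(K) >= -1e-8)))  # (4, 4) True
```

In the paper the aggregation mechanism and the similarity metric are trained jointly with task supervision; here the projection is random purely to show the shape of the computation.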


Results from the Paper


Task             Dataset   Model                                  Metric  Value   Global Rank
Link Prediction  Citeseer  Node Feature Agg + Similarity Metric   AUC     90.9%   # 11
Link Prediction  Citeseer  Node Feature Agg + Similarity Metric   AP      91.8%   # 10
Link Prediction  Cora      BANE                                   AUC     93.50%  # 8
Link Prediction  Cora      BANE                                   AP      93.2%   # 10
Link Prediction  Pubmed    Node Feature Agg + Similarity Metric   AUC     94.5%   # 9
Link Prediction  Pubmed    Node Feature Agg + Similarity Metric   AP      94.2%   # 9
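
The AUC and AP numbers above are standard link-prediction metrics: held-out node pairs are scored with the learned similarity and compared against the true edge labels. The snippet below is a hedged sketch of how such metrics are typically computed with scikit-learn; the labels and scores are synthetic stand-ins, not the paper's evaluation data.

```python
# Illustration only: computing link-prediction AUC and AP from pairwise similarity scores.
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score

rng = np.random.default_rng(0)
n_pairs = 200
labels = rng.integers(0, 2, size=n_pairs)               # 1 = true edge, 0 = sampled non-edge
scores = labels + rng.normal(scale=0.8, size=n_pairs)   # stand-in similarity scores k(u, v)

auc = roc_auc_score(labels, scores)                      # ranking quality over all thresholds
ap = average_precision_score(labels, scores)             # area under the precision-recall curve
print(f"AUC = {auc:.3f}, AP = {ap:.3f}")
```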
