Learning Graph Representation by Aggregating Subgraphs via Mutual Information Maximization

24 Mar 2021 · Chenguang Wang, Ziwen Liu

In this paper, we introduce a self-supervised learning method that enhances graph-level representations with the help of a set of subgraphs. To this end, we propose a universal framework that generates subgraphs in an auto-regressive way and then uses these subgraphs to guide the learning of graph representations by Graph Neural Networks. Under this framework, we obtain a comprehensive, learnable understanding of the graph structure. To fully capture the information in the original graphs, we design three information aggregators, attribute-conv, layer-conv, and subgraph-conv, which gather information from different aspects. To achieve efficient and effective contrastive learning, we propose a Head-Tail contrastive construction that provides abundant negative samples. All proposed components can be generalized to any Graph Neural Network; with them, we achieve new state-of-the-art results on several benchmarks in the unsupervised setting. We also evaluate our model on semi-supervised learning tasks and make a fair comparison with state-of-the-art semi-supervised methods.
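The abstract describes contrastive learning that maximizes mutual information between graph-level and subgraph-level representations. As a rough illustration of that idea only (this is not the authors' released code and does not reproduce their Head-Tail construction or the three aggregators), the PyTorch sketch below scores agreement between each graph embedding and a pooled embedding of its own subgraphs, using the other graphs in the batch as negatives via an InfoNCE-style loss. The function name, temperature value, and batch-negative scheme are assumptions made for the example.

```python
# Hypothetical sketch of a graph-vs-subgraph mutual information objective
# (InfoNCE-style), not the paper's exact loss.
import torch
import torch.nn.functional as F


def subgraph_infomax_loss(graph_emb, subgraph_emb, temperature=0.5):
    """graph_emb:    (B, d) one embedding per graph.
    subgraph_emb: (B, d) one pooled subgraph embedding per graph."""
    g = F.normalize(graph_emb, dim=-1)
    s = F.normalize(subgraph_emb, dim=-1)
    logits = g @ s.t() / temperature            # (B, B) similarity matrix
    targets = torch.arange(g.size(0), device=g.device)
    # Diagonal entries are positive (graph, own-subgraph) pairs;
    # off-diagonal entries act as negatives.
    return F.cross_entropy(logits, targets)


# Usage with random stand-in embeddings:
g = torch.randn(32, 128, requires_grad=True)
s = torch.randn(32, 128, requires_grad=True)
loss = subgraph_infomax_loss(g, s)
loss.backward()
```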
