The goal of graph representation learning is to construct a set of features (‘embeddings’) representing the structure of a graph and the data on it. We can distinguish among node-wise embeddings, which represent each node of the graph; edge-wise embeddings, which represent each edge; and graph-wise embeddings, which represent the graph as a whole.
We introduce PyTorch Geometric, a library for deep learning on irregularly structured input data such as graphs, point clouds and manifolds, built upon PyTorch.
Ranked #2 on Graph Classification on REDDIT-B
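PyTorch Geometric stores graph connectivity in COO format as a 2×E `edge_index` (first row: source nodes, second row: target nodes). A minimal plain-Python sketch of that convention, written without the library so it runs anywhere:

```python
# Sketch of PyTorch Geometric's COO edge-index convention (no library needed).
# edge_index[0][k] is the source node of edge k, edge_index[1][k] its target.
edge_index = [
    [0, 1, 1, 2],  # source nodes
    [1, 0, 2, 1],  # target nodes
]

def neighbors(edge_index, node):
    """Return the targets of all edges leaving `node`."""
    src, dst = edge_index
    return [d for s, d in zip(src, dst) if s == node]

print(neighbors(edge_index, 1))  # node 1 has outgoing edges to 0 and 2
```

In the real library, `edge_index` is a `torch.LongTensor` of shape `[2, num_edges]`; the list-of-lists above only illustrates the layout.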
In this survey, we provide a comprehensive review of knowledge graphs, covering research topics on 1) knowledge graph representation learning, 2) knowledge acquisition and completion, 3) temporal knowledge graphs, and 4) knowledge-aware applications, and we summarize recent breakthroughs and promising directions to facilitate future research.
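A classic knowledge graph representation learning model of the kind such surveys cover is TransE, which scores a triple (h, r, t) by how closely h + r approximates t. A minimal sketch (the model choice and toy vectors are illustrative, not taken from the survey):

```python
import math

def transe_score(h, r, t):
    """TransE plausibility score: negative Euclidean distance ||h + r - t||.
    Scores closer to 0 mean the (head, relation, tail) triple is more plausible."""
    return -math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))

# Toy embeddings: the relation vector translates 'Paris' onto 'France'.
paris, capital_of, france = [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]
berlin = [0.0, 0.0]

print(transe_score(paris, capital_of, france))   # 0.0 — perfect match
print(transe_score(berlin, capital_of, france))  # -1.0 — less plausible
```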
The goal of graph representation learning is to embed each vertex in a graph into a low-dimensional vector space.
Ranked #1 on Node Classification on Wikipedia
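One classical route to such low-dimensional vertex embeddings is factorizing a graph matrix. As a minimal illustration (a generic spectral sketch, not the specific method of this paper), power iteration extracts the leading eigenvector of the adjacency matrix, giving each vertex a one-dimensional structural coordinate:

```python
import math

def leading_eigvec(adj, iters=200):
    """Power iteration on an adjacency matrix (list of lists).
    Returns a unit vector whose i-th entry is a 1-D embedding of vertex i."""
    n = len(adj)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# A triangle (0-1-2) with a pendant vertex 3 attached to vertex 2.
adj = [
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
]
emb = leading_eigvec(adj)
# Vertex 2 (highest degree, most central) gets the largest coordinate.
```

Methods like DeepWalk and node2vec can be viewed as implicitly factorizing related matrices, with higher-dimensional output.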
With the importance learned from both node-level and semantic-level attention, the importance of nodes and meta-paths can be fully considered.
Ranked #1 on Heterogeneous Node Classification on DBLP (PACT) 14k
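Semantic-level attention of this kind can be sketched as a softmax-weighted combination of per-meta-path embeddings. A simplified illustration with fixed scores (the real model learns the scores; this is not the paper's exact architecture):

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def fuse_metapaths(embeddings, scores):
    """Combine one embedding per meta-path, weighted by importance scores."""
    weights = softmax(scores)
    dim = len(embeddings[0])
    return [sum(w * emb[d] for w, emb in zip(weights, embeddings)) for d in range(dim)]

# Two meta-path-specific embeddings for one node; the first meta-path is
# scored as far more important, so it dominates the fused representation.
z1, z2 = [1.0, 0.0], [0.0, 1.0]
fused = fuse_metapaths([z1, z2], scores=[5.0, 0.0])
```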
Recently, graph neural networks (GNNs) have revolutionized the field of graph representation learning by learning effective node embeddings, achieving state-of-the-art results in tasks such as node classification and link prediction.
Ranked #1 on Graph Classification on REDDIT-MULTI-12K
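The node-embedding step inside a GNN can be sketched as one round of neighborhood aggregation; here each node takes the mean of its own feature and its neighbors' features (a simplified, parameter-free stand-in for a learned GNN layer):

```python
def aggregate(features, adj):
    """One message-passing round: each node's new feature vector is the mean
    of its own features and its neighbors' features (self-loop included)."""
    n = len(features)
    dim = len(features[0])
    out = []
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j]] + [i]
        out.append([sum(features[j][d] for j in nbrs) / len(nbrs)
                    for d in range(dim)])
    return out

# Path graph 0-1-2; node 1 averages all three feature vectors.
adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
feats = [[1.0], [0.0], [1.0]]
print(aggregate(feats, adj))  # node 1's feature moves toward its neighbors'
```

A real GNN layer additionally applies a learned weight matrix and nonlinearity after the aggregation, and stacks several such rounds.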
Capturing such evolution is key to predicting the properties of unseen networks.
We examine two fundamental tasks associated with graph representation learning: link prediction and node classification.
Ranked #1 on Link Prediction on Citeseer (Accuracy metric)
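Given node embeddings, link prediction is commonly scored by an inner product passed through a sigmoid, as in graph autoencoder decoders. A minimal sketch with toy embedding values:

```python
import math

def link_prob(z_u, z_v):
    """Predicted probability of an edge (u, v): sigmoid of the inner
    product of the two node embeddings."""
    score = sum(a * b for a, b in zip(z_u, z_v))
    return 1.0 / (1.0 + math.exp(-score))

# Nodes with similar embeddings get a high predicted link probability.
z = {"a": [2.0, 1.0], "b": [1.8, 1.1], "c": [-2.0, -1.0]}
print(link_prob(z["a"], z["b"]))  # close to 1
print(link_prob(z["a"], z["c"]))  # close to 0
```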
We examine two fundamental tasks associated with graph representation learning: link prediction and semi-supervised node classification.
Ranked #23 on Node Classification on Pubmed
However, the representational power of hyperbolic geometry is not yet on par with that of Euclidean geometry, mostly because of the absence of corresponding hyperbolic neural network layers.
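The hyperbolic geometry in question is typically the Poincaré ball model, whose geodesic distance has a closed form (the standard formula, not anything specific to this abstract):

```python
import math

def poincare_dist(x, y):
    """Geodesic distance between two points strictly inside the unit
    Poincaré ball: arccosh(1 + 2||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2)))."""
    sq = lambda v: sum(c * c for c in v)
    diff = sq([a - b for a, b in zip(x, y)])
    denom = (1.0 - sq(x)) * (1.0 - sq(y))
    return math.acosh(1.0 + 2.0 * diff / denom)

origin = [0.0, 0.0]
p = [0.5, 0.0]
print(poincare_dist(origin, p))  # ln(3): distance grows fast near the boundary
```

Distances blow up as points approach the unit sphere, which is what lets tree-like graphs embed with low distortion.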