Link Prediction

819 papers with code • 78 benchmarks • 63 datasets

Link Prediction is a task in graph and network analysis whose goal is to predict missing or future connections between nodes in a network. Given a partially observed network, link prediction infers which links are most likely to be missing or added next, based on the observed connections and the structure of the network.
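A classic non-learned baseline for this task scores each unlinked node pair by its shared neighbourhood, for example with the Adamic–Adar index. The sketch below is a minimal illustration on a hypothetical toy graph (the node names and edges are invented for the example):

```python
import math
from itertools import combinations

# Hypothetical toy undirected graph, stored as symmetric adjacency sets.
adj = {
    "a": {"b", "c"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"c", "e"},
    "e": {"d"},
}

def adamic_adar(u, v):
    """Score a candidate link: sum 1/log(degree) over common neighbours,
    so rare shared neighbours count for more than hub nodes."""
    common = adj[u] & adj[v]
    return sum(1.0 / math.log(len(adj[z])) for z in common if len(adj[z]) > 1)

# Rank every node pair that is not already an edge.
candidates = [(u, v) for u, v in combinations(adj, 2) if v not in adj[u]]
ranked = sorted(candidates, key=lambda p: adamic_adar(*p), reverse=True)
print(ranked[0])  # top-ranked missing link
```

Embedding-based methods replace the hand-crafted score with a learned similarity between node vectors, but the evaluation setup (rank candidate pairs, compare against held-out edges) is the same.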

(Image credit: Inductive Representation Learning on Large Graphs)

Libraries

Use these libraries to find Link Prediction models and implementations

Most implemented papers

EvolveGCN: Evolving Graph Convolutional Networks for Dynamic Graphs

IBM/EvolveGCN 26 Feb 2019

Existing approaches typically resort to node embeddings and use a recurrent neural network (RNN, broadly speaking) to regulate the embeddings and learn the temporal dynamics.

Knowledge Graph Convolutional Networks for Recommender Systems

hwwang55/KGCN 18 Mar 2019

To alleviate the sparsity and cold-start problems of collaborative-filtering-based recommender systems, researchers and engineers usually collect attributes of users and items and design delicate algorithms to exploit this additional information.

How Attentive are Graph Attention Networks?

tech-srl/how_attentive_are_gats ICLR 2022

Because GATs use a static attention mechanism, there are simple graph problems that GAT cannot express: in a controlled problem, we show that static attention hinders GAT from even fitting the training data.
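The "static attention" limitation can be seen directly: in GAT-style scoring, each query node adds only a constant offset to the key scores, so every node ranks the candidate neighbours identically. A small NumPy sketch (random features, hypothetical sizes; not the paper's implementation) demonstrates this:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 8                       # hypothetical: 6 nodes, 8-dim features
H = rng.normal(size=(n, d))       # node features (assume already W-projected)
a1 = rng.normal(size=d)           # attention vector, query half
a2 = rng.normal(size=d)           # attention vector, key half

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# GAT-style static score: e[i, j] = LeakyReLU(a1·h_i + a2·h_j).
# a1·h_i is constant per row and LeakyReLU is monotone, so the ranking
# over keys j is the same for every query i.
scores = leaky_relu((H @ a1)[:, None] + (H @ a2)[None, :])
top_key = scores.argmax(axis=1)
print(top_key)  # every query node attends most strongly to the same key
```

GATv2's fix is to apply the nonlinearity before the attention vector, so the key ranking can depend on the query.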

Structural Deep Network Embedding

shenweichen/GraphEmbedding KDD 2016

Therefore, how to effectively capture the highly non-linear network structure while preserving both its global and local structure remains an open yet important problem.

Neural Factorization Machines for Sparse Predictive Analytics

hexiangnan/neural_factorization_machine 16 Aug 2017

However, FM models feature interactions in a linear way, which can be insufficient for capturing the non-linear and complex inherent structure of real-world data.
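For context, the second-order FM term being criticised here is a sum of inner products over feature pairs, which admits a well-known linear-time rewriting. A minimal NumPy sketch with invented sizes (this is the standard FM interaction term, not the paper's neural extension):

```python
import numpy as np

rng = np.random.default_rng(1)
n_features, k = 5, 3                    # hypothetical feature count and embedding size
x = rng.normal(size=n_features)         # dense stand-in for a sparse feature vector
V = rng.normal(size=(n_features, k))    # one k-dim embedding per feature

# Naive pairwise form: sum_{i<j} <v_i, v_j> x_i x_j  -- O(n^2 k)
naive = sum(V[i] @ V[j] * x[i] * x[j]
            for i in range(n_features) for j in range(i + 1, n_features))

# Equivalent O(nk) form: 0.5 * sum_f ((sum_i v_if x_i)^2 - sum_i v_if^2 x_i^2)
fast = 0.5 * (((x @ V) ** 2).sum() - ((x[:, None] * V) ** 2).sum())
```

Both forms are linear in each pairwise product x_i x_j; NFM's proposal is to feed the pooled interaction vector through a neural network instead of summing it directly.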

NSCaching: Simple and Efficient Negative Sampling for Knowledge Graph Embedding

yzhangee/NSCaching 16 Dec 2018

Negative sampling, which samples negative triplets from non-observed ones in the training data, is an important step in KG embedding.
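The basic corruption scheme that NSCaching builds on can be sketched in a few lines: replace the head or tail of an observed triplet with a random entity, discarding corruptions that happen to be observed facts. The entities and triplets below are hypothetical, and this is uniform sampling, not the paper's cache-based scheme:

```python
import random

random.seed(0)

# Hypothetical toy knowledge graph.
entities = ["alice", "bob", "carol", "paris", "berlin"]
observed = {("alice", "born_in", "paris"), ("bob", "born_in", "berlin")}

def negative_sample(triplet, k=3):
    """Corrupt the head or tail with a random entity, skipping observed triplets."""
    h, r, t = triplet
    negatives = []
    while len(negatives) < k:
        e = random.choice(entities)
        cand = (e, r, t) if random.random() < 0.5 else (h, r, e)
        if cand not in observed:
            negatives.append(cand)
    return negatives

negs = negative_sample(("alice", "born_in", "paris"))
```

Uniform corruption tends to produce easy negatives as training progresses; NSCaching's contribution is keeping a small cache of high-quality (hard) negatives to sample from instead.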

Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks

google-research/google-research KDD 2019

Furthermore, Cluster-GCN allows us to train much deeper GCNs without much time and memory overhead, which leads to improved prediction accuracy---using a 5-layer Cluster-GCN, we achieve a state-of-the-art test F1 score of 99.36 on the PPI dataset, while the previous best result was 98.71 by [16].

OGB-LSC: A Large-Scale Challenge for Machine Learning on Graphs

snap-stanford/ogb 17 Mar 2021

Enabling effective and efficient machine learning (ML) over large-scale graph data (e.g., graphs with billions of edges) can have a great impact on both industrial and scientific applications.

GraphGAN: Graph Representation Learning with Generative Adversarial Nets

hwwang55/GraphGAN 22 Nov 2017

The goal of graph representation learning is to embed each vertex in a graph into a low-dimensional vector space.

Learning Heterogeneous Knowledge Base Embeddings for Explainable Recommendation

xiangwang1223/knowledge_graph_attention_network 9 May 2018

Specifically, we propose a knowledge-base representation learning framework to embed heterogeneous entities for recommendation, and based on the embedded knowledge base, a soft matching algorithm is proposed to generate personalized explanations for the recommended items.