Knowledge base completion is the task of automatically inferring missing facts by reasoning about the information already present in a knowledge base. A knowledge base is a collection of relational facts, often represented as (subject, relation, object) triples.
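To make the triple representation concrete, here is a minimal sketch of a knowledge base as a set of (subject, relation, object) triples; the entities and facts below are hypothetical examples, not drawn from any benchmark.

```python
# A toy knowledge base: a set of (subject, relation, object) triples.
# All entities and facts here are illustrative assumptions.
kb = {
    ("Paris", "capital_of", "France"),
    ("Berlin", "capital_of", "Germany"),
    ("France", "located_in", "Europe"),
}

def is_known(subject, relation, obj):
    """Return True if the triple is already present in the knowledge base."""
    return (subject, relation, obj) in kb

# Completion asks whether missing triples such as
# ("Germany", "located_in", "Europe") can be inferred from the known facts.
print(is_known("Paris", "capital_of", "France"))   # a stored fact
print(is_known("Germany", "located_in", "Europe")) # a missing fact to infer
```

A completion model scores such candidate triples rather than looking them up, but the data structure being completed is exactly this set of triples.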
We demonstrate the effectiveness of R-GCNs as a stand-alone model for entity classification.
Ranked #1 on Node Classification on AIFB
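The R-GCN idea can be sketched in a few lines: each relation type gets its own weight matrix, and a node aggregates neighbor features per relation, plus a self-loop term. The toy graph, dimensions, and random weights below are illustrative assumptions, not the paper's trained model.

```python
import numpy as np

# Hedged sketch of a single R-GCN layer: one weight matrix W_r per
# relation, plus a self-loop weight W_0. Sizes are arbitrary choices.
rng = np.random.default_rng(0)
n_nodes, n_rels, d_in, d_out = 4, 2, 8, 5

H = rng.normal(size=(n_nodes, d_in))            # input node features
W_rel = rng.normal(size=(n_rels, d_in, d_out))  # per-relation weights
W_self = rng.normal(size=(d_in, d_out))         # self-loop weight

# adjacency per relation: adj[r][i] = neighbors j of node i under relation r
adj = [
    {0: [1], 1: [0, 2], 2: [3], 3: []},  # relation 0
    {0: [2], 1: [], 2: [0], 3: [1]},     # relation 1
]

def rgcn_layer(H):
    out = H @ W_self                       # self-connection term
    for r in range(n_rels):
        for i, neighbors in adj[r].items():
            if neighbors:                  # normalize by neighbor count
                agg = sum(H[j] for j in neighbors) / len(neighbors)
                out[i] += agg @ W_rel[r]
    return np.maximum(out, 0)              # ReLU nonlinearity

H_next = rgcn_layer(H)
print(H_next.shape)  # (4, 5)
```

For entity classification, the final layer's output per node is fed to a softmax over entity classes.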
The recent proliferation of knowledge graphs (KGs), coupled with incomplete or partial information in the form of missing relations (links) between entities, has fueled substantial research on knowledge base completion (also known as relation prediction).
Ranked #2 on Link Prediction on FB15k-237
This paper tackles the problem of endogenous link prediction for knowledge base completion.
Knowledge bases (KBs) of real-world facts about entities and their relationships are useful resources for a variety of natural language processing tasks.
This framework is independent of the concrete forms of its generator and discriminator, and can therefore use a wide variety of knowledge graph embedding models as building blocks.
Ranked #11 on Link Prediction on WN18
We present KBLRN, a framework for end-to-end learning of knowledge base representations from latent, relational, and numerical features.
This 3-column matrix is then fed to a convolution layer, where multiple filters are applied to the matrix to generate different feature maps.
Ranked #19 on Link Prediction on WN18RR
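The convolution step described above can be sketched as follows: the subject, relation, and object embeddings are stacked into a k x 3 matrix, and each 1 x 3 filter slides over its rows to produce a k-dimensional feature map. The dimensions, random embeddings, and scoring head below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of a ConvKB-style convolution over a triple's embeddings.
rng = np.random.default_rng(1)
k, n_filters = 6, 3                         # embedding dim, filter count

s, r, o = (rng.normal(size=k) for _ in range(3))
M = np.stack([s, r, o], axis=1)             # k x 3 input matrix

filters = rng.normal(size=(n_filters, 3))   # each filter spans one row (1 x 3)

# Each filter yields one value per row of M, i.e. a k-dim feature map.
feature_maps = np.maximum(M @ filters.T, 0) # shape (k, n_filters), ReLU

# Feature maps are concatenated and scored with a weight vector w.
w = rng.normal(size=k * n_filters)
score = feature_maps.T.reshape(-1) @ w
print(feature_maps.shape)  # (6, 3)
```

The resulting scalar score ranks candidate triples: higher scores indicate more plausible facts.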
The recent graph convolutional network (GCN) provides another way of learning graph node embeddings by exploiting the graph's connectivity structure.
Ranked #11 on Link Prediction on FB15k-237
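A single GCN propagation step, which mixes each node's features with those of its neighbors via a normalized adjacency matrix, can be sketched as below. The toy adjacency matrix, feature sizes, and random weights are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of one GCN layer, following the propagation rule
# H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)

A_hat = A + np.eye(4)                        # add self-loops
deg = A_hat.sum(axis=1)
D_inv_sqrt = np.diag(deg ** -0.5)
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt     # symmetric normalization

rng = np.random.default_rng(2)
H = rng.normal(size=(4, 8))                  # input node features
W = rng.normal(size=(8, 4))                  # layer weights

H_next = np.maximum(A_norm @ H @ W, 0)       # one propagation step + ReLU
print(H_next.shape)  # (4, 4)
```

Stacking such layers lets information propagate over multi-hop neighborhoods, which is what link-prediction encoders built on GCNs exploit.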