We release an open toolkit for knowledge embedding (OpenKE), which provides a unified framework and various fundamental models to embed knowledge graphs into a continuous low-dimensional space.
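One of the fundamental models toolkits like OpenKE implement is TransE, which embeds a true triple (h, r, t) so that h + r ≈ t in the continuous space. The sketch below is illustrative only (random toy vectors, not the OpenKE API), assuming a plain L2 translation distance as the score:

```python
import numpy as np

# Toy embeddings; TransE models a plausible triple (h, r, t) as a
# translation h + r ≈ t in the embedding space.
rng = np.random.default_rng(0)
dim = 8
h = rng.normal(size=dim)          # head entity embedding
r = rng.normal(size=dim)          # relation embedding
t_true = h + r                    # a tail that satisfies the translation exactly
t_corrupt = rng.normal(size=dim)  # a random, implausible tail

def transe_score(h, r, t):
    """Translation distance ||h + r - t||_2; lower means more plausible."""
    return float(np.linalg.norm(h + r - t))

# The true tail scores zero; the corrupted one scores strictly higher.
assert transe_score(h, r, t_true) < transe_score(h, r, t_corrupt)
```

In training, such a score is typically optimized with a margin-based ranking loss that pushes corrupted triples further from the translation than true ones.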
In statistical relational learning, knowledge graph completion deals with automatically understanding the structure of large knowledge graphs (labeled directed graphs) and predicting missing relationships (labeled edges).
Ranked #2 on Knowledge Graphs on FB15k
Learning embeddings of entities and relations is an efficient and versatile method to perform machine learning on relational data such as knowledge graphs.
Ranked #5 on Link Prediction on FB15k
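A common way to perform machine learning on such relational data is to score candidate triples with a simple function of the entity and relation embeddings. The sketch below uses a DistMult-style bilinear product as one illustrative choice (toy vectors, not necessarily the scoring function of the paper above):

```python
import numpy as np

def distmult_score(h, r, t):
    """Bilinear score <h, r, t> = sum_i h_i * r_i * t_i; higher = more plausible."""
    return float(np.sum(h * r * t))

# Toy 2-dimensional example: the relation vector weights the dimensions
# along which head and tail embeddings should agree.
h = np.array([1.0, 0.0])
r = np.array([1.0, 1.0])
t_match = np.array([1.0, 0.0])
t_mismatch = np.array([0.0, 1.0])
assert distmult_score(h, r, t_match) > distmult_score(h, r, t_mismatch)
```

Note that this bilinear form is symmetric in h and t, so it cannot model asymmetric relations; that limitation motivates richer variants such as complex-valued or holographic embeddings.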
We demonstrate the effectiveness of R-GCNs as a stand-alone model for entity classification.
Ranked #1 on Node Classification on AIFB
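The core idea of an R-GCN layer is relation-specific message passing: each relation type has its own weight matrix, and a node aggregates neighbor features per relation with a normalizing constant, plus a self-loop term. This is a simplified sketch (random toy weights, no basis decomposition, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, d_in, d_out = 4, 3, 2
X = rng.normal(size=(n_nodes, d_in))            # input node features
edges = {0: [(1, 0), (2, 0)], 1: [(3, 0)]}      # edges[r] = (src, dst) pairs
W = {r: rng.normal(size=(d_in, d_out)) for r in edges}  # per-relation weights W_r
W_self = rng.normal(size=(d_in, d_out))         # self-loop weight W_0

def rgcn_layer(X, edges, W, W_self):
    out = X @ W_self                            # self-loop term W_0 h_i
    for r, pairs in edges.items():
        # normalizer c_{i,r}: number of incoming relation-r edges per node
        deg = np.zeros(len(X))
        for _, dst in pairs:
            deg[dst] += 1
        for src, dst in pairs:
            out[dst] += (X[src] @ W[r]) / deg[dst]
    return np.maximum(out, 0)                   # ReLU activation

H = rgcn_layer(X, edges, W, W_self)             # shape (n_nodes, d_out)
```

For entity classification, a softmax layer on the final node representations predicts each entity's label.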
Neural language representation models such as BERT, pre-trained on large-scale corpora, can capture rich semantic patterns from plain text and can be fine-tuned to consistently improve the performance of various NLP tasks.
Ranked #1 on Relation Extraction on FewRel
Given a learned knowledge graph (KG), our approach takes as input semantic embeddings for each node (each representing a visual category).
In this survey, we provide a comprehensive review of knowledge graphs covering the main research topics: 1) knowledge graph representation learning, 2) knowledge acquisition and completion, 3) temporal knowledge graphs, and 4) knowledge-aware applications, and we summarize recent breakthroughs and prospective directions to facilitate future research.
In this paper, we provide a review of how such statistical models can be "trained" on large knowledge graphs, and then used to predict new facts about the world (which is equivalent to predicting new edges in the graph).
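Predicting new edges in practice means ranking: for a query (h, r, ?), the trained model scores every entity as a candidate tail, and evaluation checks where the held-out true tail lands in that ranking. A minimal sketch, assuming toy vectors and a TransE-style distance rather than any particular trained model:

```python
import numpy as np

rng = np.random.default_rng(2)
dim = 4
E = rng.normal(size=(5, dim))   # embeddings for 5 entities (toy values)
r = rng.normal(size=dim)        # relation embedding
h_idx, true_t_idx = 0, 3
E[true_t_idx] = E[h_idx] + r    # make entity 3 the correct tail for (h, r, ?)

# Score every candidate tail by its distance from h + r, then rank
# the true tail among all entities (rank 1 = best).
dists = np.linalg.norm(E[h_idx] + r - E, axis=1)
rank = int(np.argsort(dists).tolist().index(true_t_idx)) + 1
assert rank == 1
```

Aggregating such ranks over a held-out test set yields the standard link-prediction metrics (mean rank, mean reciprocal rank, Hits@k).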