LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention

Entity representations are useful in natural language tasks involving entities. In this paper, we propose new pretrained contextualized representations of words and entities based on the bidirectional transformer...
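The abstract is truncated here, but the mechanism named in the title can be illustrated. The sketch below is a minimal, single-head NumPy rendition of an entity-aware self-attention layer, under the assumption that the query projection is selected per (query, key) token-type pair (word-word, word-entity, entity-word, entity-entity) while keys and values are shared; all names (`Wq`, `Wk`, `Wv`) and the toy dimensions are illustrative, not the paper's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def entity_aware_attention(x, is_entity, Wq, Wk, Wv):
    """Single-head sketch of entity-aware self-attention.

    x: (n, d) token embeddings; is_entity: (n,) bool flags.
    Wq: dict of (d, d) query matrices keyed by token-type pair
        ("ww", "we", "ew", "ee"); Wk, Wv: shared (d, d) key/value matrices.
    The attention score between positions i and j uses the query matrix
    chosen by whether i and j are word or entity tokens.
    """
    n, d = x.shape
    K, V = x @ Wk, x @ Wv
    scores = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            pair = ("e" if is_entity[i] else "w") + ("e" if is_entity[j] else "w")
            scores[i, j] = (x[i] @ Wq[pair]) @ K[j] / np.sqrt(d)
    return softmax(scores, axis=-1) @ V

# Toy usage: 5 tokens, positions 2 and 4 flagged as entity tokens, hidden size 8.
rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal((5, d))
is_entity = np.array([False, False, True, False, True])
Wq = {p: rng.standard_normal((d, d)) for p in ("ww", "we", "ew", "ee")}
Wk, Wv = rng.standard_normal((d, d)), rng.standard_normal((d, d))
out = entity_aware_attention(x, is_entity, Wq, Wk, Wv)
```

The point of the four query matrices is that attention can weigh, say, entity-to-word interactions differently from word-to-word ones, which a standard transformer head cannot do.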

EMNLP 2020

Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Named Entity Recognition | CoNLL 2003 (English) | LUKE | F1 | 94.3 | #1 |
| Entity Typing | Open Entity | LUKE | F1 | 78.2 | #1 |
| Question Answering | ReCoRD | LUKE | F1 | 91.2 | #2 |
| Question Answering | SQuAD1.1 | LUKE | EM | 90.2 | #1 |
| Question Answering | SQuAD1.1 | LUKE | F1 | 95.4 | #1 |
| Question Answering | SQuAD1.1 dev | LUKE | EM | 89.8 | #2 |
| Question Answering | SQuAD1.1 dev | LUKE | F1 | 95.0 | #4 |
| Relation Extraction | TACRED | LUKE | Relation F1 | 72.7 | #1 |
| Relation Extraction | TACRED | LUKE | F1 | 72.7 | #2 |

Methods used in the Paper