Two are Better than One: Joint Entity and Relation Extraction with Table-Sequence Encoders

EMNLP 2020 · Jue Wang, Wei Lu

Named entity recognition and relation extraction are two fundamental problems. Joint learning algorithms have been proposed to solve both tasks simultaneously, and many of them cast the joint task as a table-filling problem. However, they typically focus on learning a single encoder (usually learning representations in the form of a table) to capture the information required for both tasks within the same space. We argue that it can be beneficial to design two distinct encoders to capture these two different types of information in the learning process. In this work, we propose the novel *table-sequence encoders*, where two different encoders, a table encoder and a sequence encoder, are designed to help each other in the representation learning process. Our experiments confirm the advantages of having *two* encoders over *one* encoder. On several standard datasets, our model shows significant improvements over existing approaches.
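To make the table-filling formulation mentioned in the abstract concrete, here is a minimal sketch of how a sentence's entities and relations can be encoded as a single n×n label table: diagonal cells carry entity tags and off-diagonal cells carry relation labels between token pairs. The label names and span conventions below are illustrative assumptions, not the paper's exact tagging scheme.

```python
# Illustrative sketch of the table-filling formulation for joint entity
# and relation extraction. Label names ("PER", "PHYS", ...) and the
# half-open span convention are assumptions for this example only.

def fill_table(tokens, entities, relations):
    """Build an n x n label table: cell (i, i) holds the entity tag of
    token i, and cell (i, j) holds the relation label between the
    entities containing tokens i and j (or "O" for none)."""
    n = len(tokens)
    table = [["O"] * n for _ in range(n)]
    # Entities: (start, end, tag) with half-open token spans.
    for start, end, tag in entities:
        for i in range(start, end):
            table[i][i] = tag
    # Relations: (head_span, tail_span, label).
    for (hs, he), (ts, te), rel in relations:
        for i in range(hs, he):
            for j in range(ts, te):
                table[i][j] = rel
    return table

tokens = ["John", "lives", "in", "Seattle"]
entities = [(0, 1, "PER"), (3, 4, "LOC")]
relations = [((0, 1), (3, 4), "PHYS")]
table = fill_table(tokens, entities, relations)
```

A model then predicts one label per cell; the paper's contribution is to learn this table representation with a dedicated table encoder while a separate sequence encoder handles the token-level (NER) view, with the two encoders interacting during training.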

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Relation Extraction | ACE 2004 | Table-Sequence | RE Micro F1 | 63.3 | #3 |
| Relation Extraction | ACE 2004 | Table-Sequence | NER Micro F1 | 88.6 | #4 |
| Relation Extraction | ACE 2004 | Table-Sequence | RE+ Micro F1 | 59.6 | #4 |
| Relation Extraction | ACE 2004 | Table-Sequence | Cross Sentence | No | #1 |
| Relation Extraction | ACE 2005 | Table-Sequence | RE Micro F1 | 67.6 | #5 |
| Relation Extraction | ACE 2005 | Table-Sequence | NER Micro F1 | 89.5 | #5 |
| Relation Extraction | ACE 2005 | Table-Sequence | RE+ Micro F1 | 64.3 | #5 |
| Relation Extraction | ACE 2005 | Table-Sequence | Sentence Encoder | ALBERT | #1 |
| Relation Extraction | ACE 2005 | Table-Sequence | Cross Sentence | No | #1 |
| Relation Extraction | Adverse Drug Events (ADE) Corpus | Table-Sequence | RE+ Macro F1 | 80.1 | #9 |
| Relation Extraction | Adverse Drug Events (ADE) Corpus | Table-Sequence | NER Macro F1 | 89.7 | #6 |
| Relation Extraction | Adverse Drug Events (ADE) Corpus | Table-Sequence | RE Macro F1 | 80.1 | #1 |
| Relation Extraction | CoNLL04 | Table-Sequence | NER Macro F1 | 86.9 | #2 |
| Relation Extraction | CoNLL04 | Table-Sequence | RE+ Micro F1 | 73.6 | #3 |
| Relation Extraction | CoNLL04 | Table-Sequence | RE+ Macro F1 | 75.4 | #2 |
| Relation Extraction | CoNLL04 | Table-Sequence | NER Micro F1 | 90.1 | #4 |
| Zero-shot Relation Triplet Extraction | FewRel | TableSequence | Avg. F1 | 6.37 | #3 |
| Zero-shot Relation Triplet Extraction | Wiki-ZSL | TableSequence | Avg. F1 | 6.4 | #2 |
