Neural Relation Extraction via Inner-Sentence Noise Reduction and Transfer Learning

Relation extraction is critical for knowledge base construction and completion, and distantly supervised methods are widely used to extract relational facts automatically by aligning text with existing knowledge bases. However, the automatically constructed datasets contain many low-quality sentences with noisy words, a problem neglected by current distantly supervised methods that leads to unacceptable precision. To mitigate this problem, we propose a novel word-level distantly supervised approach to relation extraction. We first build a Sub-Tree Parse (STP) to remove noisy words that are irrelevant to the relation. We then feed the sub-tree into a neural network and apply entity-wise attention to identify the important semantic features of relational words in each instance. To make our model more robust against noisy words, we initialize the network with a priori knowledge learned from the related task of entity classification via transfer learning. We conduct extensive experiments on the New York Times (NYT) corpus aligned with Freebase. Experiments show that our approach is effective and improves the area under the Precision/Recall (PR) curve from 0.35 to 0.39 over the state-of-the-art work.
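To make the STP step concrete, below is a minimal sketch using spaCy that keeps only the dependency sub-tree spanning the two entities and discards the rest of the sentence. The rooting rule used here (the lowest common ancestor of the two entities) and all function names are illustrative assumptions; the paper's exact STP construction may differ.

```python
# Hypothetical sketch of Sub-Tree Parse (STP): prune a sentence down to the
# dependency sub-tree that covers both entities. Not the paper's exact rule.
import spacy

nlp = spacy.load("en_core_web_sm")

def lowest_common_ancestor(tok1, tok2):
    """Walk up from tok2 until we hit an ancestor of tok1 (or tok1 itself)."""
    ancestors = set([tok1] + list(tok1.ancestors))
    node = tok2
    while node not in ancestors:
        node = node.head
    return node

def sub_tree_parse(sentence, entity1, entity2):
    """Keep only tokens inside the sub-tree rooted at the entities' LCA."""
    doc = nlp(sentence)
    tok1 = next(t for t in doc if t.text == entity1)
    tok2 = next(t for t in doc if t.text == entity2)
    root = lowest_common_ancestor(tok1, tok2)
    return [t.text for t in root.subtree]

# The exact output depends on the parser, but the main-clause words
# ("Analysts noted") fall outside the entities' sub-tree and are dropped.
print(sub_tree_parse("Analysts noted that Obama was born in Hawaii .",
                     "Obama", "Hawaii"))
```

Similarly, entity-wise attention can be sketched as letting the two entity hidden states query the BiGRU outputs so that relation-bearing words receive higher weight. The class name, scoring function, and dimensions below are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch of entity-wise attention over BiGRU hidden states.
import torch
import torch.nn as nn

class EntityWiseAttention(nn.Module):
    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, states, e1_idx, e2_idx):
        # states: (batch, seq_len, hidden_dim) BiGRU outputs
        batch = torch.arange(states.size(0))
        query = states[batch, e1_idx] + states[batch, e2_idx]   # (batch, hidden)
        scores = torch.einsum("bsh,bh->bs", self.proj(states), query)
        weights = torch.softmax(scores, dim=1)                  # (batch, seq_len)
        return torch.einsum("bs,bsh->bh", weights, states)      # pooled sentence rep

# Usage: a bidirectional GRU with 32 units per direction gives hidden_dim = 64.
gru = nn.GRU(50, 32, bidirectional=True, batch_first=True)
attn = EntityWiseAttention(64)
states, _ = gru(torch.randn(2, 10, 50))
pooled = attn(states, torch.tensor([1, 0]), torch.tensor([7, 9]))
print(pooled.shape)  # torch.Size([2, 64])
```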


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Relationship Extraction (Distant Supervised) | New York Times Corpus | BiGRU+WLA+EWA | AUC | 0.390 | #3 |
| Relationship Extraction (Distant Supervised) | New York Times Corpus | BiGRU+WLA+EWA | Average Precision | 0.390 | #1 |
| Relationship Extraction (Distant Supervised) | New York Times Corpus | BGRU-SET | AUC | 0.390 | #3 |
| Relationship Extraction (Distant Supervised) | New York Times Corpus | BGRU-SET | Average Precision | 0.390 | #1 |

Methods


No methods listed for this paper.