Relation Classification is the task of identifying the semantic relation holding between two nominal entities in text.
We present FewRel 2.0, a more challenging task to investigate two aspects of few-shot relation classification models: (1) Can they adapt to a new domain with only a handful of instances? (2) Can they detect none-of-the-above relations?
We present a novel end-to-end neural model to extract entities and relations between them.
Ranked #3 on Relation Extraction on ACE 2004
The model is trained using strong within-sentence negative samples, which are efficiently extracted in a single BERT pass.
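The key point is that every candidate entity pair in a sentence shares the same encoder pass, so unlabeled pairs can serve as negatives essentially for free. A minimal sketch of that enumeration step, assuming entity spans are already known (the function name and "no_relation" label are hypothetical, not from the paper):

```python
from itertools import permutations

def build_pair_labels(entity_spans, gold_relations):
    """Enumerate all ordered entity-span pairs within one sentence.

    Pairs listed in `gold_relations` keep their relation label; every
    other pair becomes a within-sentence negative ("no_relation").
    Because all pairs come from the same sentence, their span
    representations can be read off a single BERT forward pass.
    """
    examples = []
    for head, tail in permutations(range(len(entity_spans)), 2):
        label = gold_relations.get((head, tail), "no_relation")
        examples.append((entity_spans[head], entity_spans[tail], label))
    return examples

# Sentence with three entity mentions; one gold relation between 0 and 1.
spans = [(0, 2), (5, 6), (9, 11)]          # (start, end) token offsets
gold = {(0, 1): "founded_by"}
pairs = build_pair_labels(spans, gold)     # 6 ordered pairs, 1 positive
```

With n mentions this yields n(n-1) ordered pairs per sentence, which is why a single shared encoding makes the negatives cheap.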
Ranked #1 on Relation Extraction on CoNLL04
In this paper, we propose a novel model for relation classification at the sentence level from noisy data.
Our model not only exploits entities and their latent types effectively as features, but is also more interpretable: the attention mechanisms applied in the model and the results of latent entity typing (LET) can be visualized.
Ranked #10 on Relation Extraction on SemEval-2010 Task 8
In this paper, we propose a model that both leverages the pre-trained BERT language model and incorporates information from the target entities to tackle the relation classification task.
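A common way to incorporate target-entity information into BERT is to wrap each entity in special marker tokens before encoding, so the model can locate both arguments. A minimal sketch of that preprocessing step, assuming token-level spans; the '$'/'#' marker convention and the function name are illustrative assumptions, not necessarily the paper's exact scheme:

```python
def mark_entities(tokens, e1_span, e2_span):
    """Insert marker tokens around the two target entities before the
    sequence is fed to BERT. Spans are (start, end) token indices with
    end exclusive; '$' wraps entity 1 and '#' wraps entity 2."""
    (s1, e1), (s2, e2) = e1_span, e2_span
    out = []
    for i, tok in enumerate(tokens):
        if i == s1:
            out.append("$")          # open entity-1 marker
        if i == s2:
            out.append("#")          # open entity-2 marker
        out.append(tok)
        if i == e1 - 1:
            out.append("$")          # close entity-1 marker
        if i == e2 - 1:
            out.append("#")          # close entity-2 marker
    return out

tokens = "Steve Jobs founded Apple .".split()
marked = mark_entities(tokens, (0, 2), (3, 4))
# marked: ['$', 'Steve', 'Jobs', '$', 'founded', '#', 'Apple', '#', '.']
```

The classifier then typically combines the pooled sentence representation with the hidden states at the two marked spans.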
Ranked #3 on Relation Extraction on TACRED
Distantly supervised relation extraction is widely used to extract relational facts from text, but suffers from noisy labels.
The relation of each sentence is first recognized by distant supervision methods, and then filtered by crowdworkers.
The reported results bring this submission to third place on subtask 1 (word relatedness) and first place on subtask 2 (semantic relation classification), demonstrating the utility of integrating the complementary path-based and distributional information sources in recognizing concrete semantic relations.