Coreference resolution is the task of clustering mentions in text that refer to the same underlying real-world entities.
Example: "I voted for Obama because he was most aligned with my values", she said.
"I", "my", and "she" belong to the same cluster and "Obama" and "he" belong to the same cluster.
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP).
SOTA for Linguistic Acceptability on CoLA
Tasks: Common Sense Reasoning, Coreference Resolution, Document Summarization, Linguistic Acceptability, Machine Translation, Natural Language Inference, Question Answering, Semantic Textual Similarity, Sentiment Analysis, Text Classification, Transfer Learning, Word Sense Disambiguation
We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy).
#2 best model for Sentiment Analysis on SST-5 (fine-grained classification)
We introduce Stanza, an open-source Python natural language processing toolkit supporting 66 human languages.
By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do.
SOTA for Language Modelling on Penn Treebank (Word Level) (using extra training data)
Tasks: Common Sense Reasoning, Coreference Resolution, Domain Adaptation, Few-Shot Learning, Language Modelling, Natural Language Inference, Question Answering, Sentence Completion, Unsupervised Machine Translation, Word Sense Disambiguation
We introduce a fully differentiable approximation to higher-order inference for coreference resolution.
#6 best model for Coreference Resolution on OntoNotes
We introduce the first end-to-end coreference resolution model and show that it significantly outperforms all previous work without using a syntactic parser or hand-engineered mention detector.
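End-to-end models of this kind decide, for each mention, which earlier mention (if any) is its antecedent: each mention links to the highest-scoring candidate, or to a dummy antecedent (score 0) meaning it starts a new entity. The toy sketch below illustrates only that argmax linking step; the mention list and the hand-picked pairwise scores are hypothetical stand-ins for what a neural network would compute.

```python
def link_antecedents(mentions, pair_score):
    """For each mention i, pick the best-scoring antecedent j < i, or None."""
    links = {}
    for i in range(len(mentions)):
        best_j, best_s = None, 0.0          # the dummy antecedent scores 0
        for j in range(i):
            s = pair_score(mentions[i], mentions[j])
            if s > best_s:
                best_j, best_s = j, s
        links[i] = best_j                   # None = starts a new entity
    return links

mentions = ["I", "Obama", "he", "my", "she"]
scores = {("he", "Obama"): 2.5, ("my", "I"): 1.8,
          ("she", "I"): 1.2, ("she", "my"): 0.9}
links = link_antecedents(mentions, lambda a, b: scores.get((a, b), -1.0))
# "he" links to "Obama"; "my" and "she" link back to "I".
```

Clusters then fall out of the link structure: following antecedent links transitively recovers the two entity clusters from the example sentence.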
SOTA for Coreference Resolution on CoNLL 2012
We present SpanBERT, a pre-training method that is designed to better represent and predict spans of text.
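The core idea is masking contiguous spans rather than individual tokens. The snippet below is a simplified illustration of that masking scheme, not SpanBERT's actual implementation: real SpanBERT samples span lengths from a geometric distribution and masks about 15% of tokens, whereas this toy version masks a single span of fixed length.

```python
import random

def mask_span(tokens, span_len, rng, mask_token="[MASK]"):
    """Replace one contiguous span of span_len tokens with mask tokens."""
    start = rng.randrange(0, len(tokens) - span_len + 1)
    masked = list(tokens)
    for i in range(start, start + span_len):
        masked[i] = mask_token
    return masked, (start, start + span_len - 1)

rng = random.Random(0)
tokens = "an American football game".split()
masked, span = mask_span(tokens, 2, rng)
```

During pre-training the model is then asked to recover the entire masked span, which encourages span-level rather than token-level representations.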
SOTA for Question Answering on HotpotQA
Coreference resolution systems are typically trained with heuristic loss functions that require careful tuning.
#11 best model for Coreference Resolution on OntoNotes
A long-standing challenge in coreference resolution has been the incorporation of entity-level information: features defined over clusters of mentions instead of mention pairs.
#12 best model for Coreference Resolution on OntoNotes