
# Coreference Resolution

89 papers with code · Natural Language Processing

Coreference resolution is the task of clustering mentions in text that refer to the same underlying real-world entity.

Example:

```
             +-------------+
             |             |
"I voted for Obama because he was most aligned with my values", she said.
 |                                                   |           |
 +---------------------------------------------------+-----------+
```


"I", "my", and "she" belong to the same cluster and "Obama" and "he" belong to the same cluster.
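The clusters above can be made concrete with a minimal sketch of how a coreference resolver's output is commonly represented: clusters of `(start, end)` token spans over the input. The tokenization and indices below are illustrative, not the output of any particular system.

```python
# Tokenized example sentence (illustrative tokenization).
tokens = ['"', "I", "voted", "for", "Obama", "because", "he", "was",
          "most", "aligned", "with", "my", "values", '"', ",", "she",
          "said", "."]

# Each mention is a (start, end) token span (end inclusive);
# each cluster groups mentions that corefer.
clusters = [
    [(1, 1), (11, 11), (15, 15)],   # "I", "my", "she"
    [(4, 4), (6, 6)],               # "Obama", "he"
]

def mention_text(span):
    """Recover the surface text of a mention span."""
    start, end = span
    return " ".join(tokens[start:end + 1])

for cluster in clusters:
    print([mention_text(span) for span in cluster])
    # → ['I', 'my', 'she'] then ['Obama', 'he']
```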


# Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP).

# Deep contextualized word representations

We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy).

# Language Models are Few-Shot Learners

By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do.

# Stanza: A Python Natural Language Processing Toolkit for Many Human Languages

We introduce Stanza, an open-source Python natural language processing toolkit supporting 66 human languages.

# SpanBERT: Improving Pre-training by Representing and Predicting Spans

We present SpanBERT, a pre-training method that is designed to better represent and predict spans of text.
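A simplified sketch of the idea, in the spirit of SpanBERT's span-masking scheme: instead of masking individual tokens, contiguous spans are masked, with span lengths drawn from a geometric distribution. All parameter values here are illustrative stand-ins, not SpanBERT's actual configuration.

```python
import random

def mask_spans(tokens, mask_rate=0.15, p=0.2, max_len=10,
               mask_token="[MASK]"):
    """Mask contiguous spans until roughly mask_rate of the tokens
    are masked. Span lengths follow a geometric distribution,
    clipped at max_len. Parameters are illustrative."""
    out = list(tokens)
    budget = max(1, int(len(tokens) * mask_rate))
    masked = set()
    while len(masked) < budget:
        # Geometric span length: grow with probability (1 - p).
        length = 1
        while length < max_len and random.random() > p:
            length += 1
        start = random.randrange(0, len(tokens))
        for i in range(start, min(start + length, len(tokens))):
            masked.add(i)
    for i in masked:
        out[i] = mask_token
    return out

random.seed(0)
print(mask_spans([f"tok{i}" for i in range(20)]))
```

The pre-training objective then predicts the masked tokens, including from the span's boundary representations; this sketch only covers the masking step.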

# Higher-order Coreference Resolution with Coarse-to-fine Inference

We introduce a fully differentiable approximation to higher-order inference for coreference resolution.

# End-to-end Neural Coreference Resolution

We introduce the first end-to-end coreference resolution model and show that it significantly outperforms all previous work without using a syntactic parser or hand-engineered mention detector.
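The decoding step of such mention-ranking models can be sketched as follows: each mention links to its highest-scoring preceding mention, or to a dummy "no antecedent" with score 0. The mentions and score table below are hand-made stand-ins for learned neural scores, not output of the actual model.

```python
def decode_antecedents(mentions, pair_score):
    """For each mention i, pick the argmax over candidate
    antecedents j < i of pair_score(j, i), linking only when the
    best score beats the dummy antecedent's score of 0."""
    links = {}
    for i in range(len(mentions)):
        best_j, best_s = None, 0.0  # dummy antecedent scores 0
        for j in range(i):
            s = pair_score(j, i)
            if s > best_s:
                best_j, best_s = j, s
        links[i] = best_j
    return links

# Illustrative mentions (in document order) and score table.
mentions = ["I", "Obama", "he", "my", "she"]
scores = {(1, 2): 5.0, (0, 3): 4.0, (0, 4): 2.0, (3, 4): 3.5}
links = decode_antecedents(mentions,
                           lambda j, i: scores.get((j, i), -1.0))
print(links)  # → {0: None, 1: None, 2: 1, 3: 0, 4: 3}
```

Taking the transitive closure of these links recovers the clusters {"I", "my", "she"} and {"Obama", "he"} from the example above.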

# Deep Reinforcement Learning for Mention-Ranking Coreference Models

Coreference resolution systems are typically trained with heuristic loss functions that require careful tuning.

# Improving Coreference Resolution by Learning Entity-Level Distributed Representations

A long-standing challenge in coreference resolution has been the incorporation of entity-level information - features defined over clusters of mentions instead of mention pairs.
