
Word Sense Disambiguation

41 papers with code · Natural Language Processing

The task of Word Sense Disambiguation (WSD) consists of associating words in context with their most suitable entry in a pre-defined sense inventory. The de facto sense inventory for English WSD is WordNet. For example, given the word “mouse” and the following sentence:

“A mouse consists of an object held in one's hand, with one or more buttons.”

we would assign “mouse” its electronic device sense (the 4th sense in the WordNet sense inventory).
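
To make the task concrete, here is a minimal sketch using NLTK's WordNet interface: it lists the candidate senses of “mouse” and applies the classic Lesk gloss-overlap baseline. Lesk is only an illustration of the task, not one of the methods tracked below, and the preprocessing is deliberately naive.

```python
# Minimal WSD sketch: enumerate the WordNet senses of "mouse" and pick one
# with the classic Lesk gloss-overlap baseline (illustration only; far
# from the state of the art tracked on this page).
# Requires: nltk.download('wordnet')  # plus 'omw-1.4' on recent NLTK
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk

sentence = "A mouse consists of an object held in one's hand, with one or more buttons."
context = sentence.lower().replace(",", "").replace(".", "").split()

# The candidate senses of "mouse" as a noun, in WordNet order.
for i, synset in enumerate(wn.synsets("mouse", pos=wn.NOUN), start=1):
    print(f"{i}. {synset.name()}: {synset.definition()}")

# Lesk chooses the sense whose gloss shares the most words with the context.
sense = lesk(context, "mouse", pos=wn.NOUN)
print("Predicted:", sense.name(), "-", sense.definition())
```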

Benchmarks

(Interactive leaderboard table; columns: trend, dataset, best method, paper title, paper, code, compare.)

Latest papers with code

Language Models are Few-Shot Learners

28 May 2020 openai/gpt-3

By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do.
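
The few-shot setting is easy to picture for WSD: the labeled examples live entirely in the prompt and the model predicts by completing it, with no gradient updates. The prompt format below is an assumption for illustration, not the paper's exact evaluation protocol.

```python
# Hedged illustration of few-shot WSD via prompting: two labeled examples
# act as the entire "training set"; a language model's completion of the
# last line is the prediction. The format is illustrative only, not the
# paper's exact evaluation setup.
prompt = (
    "Give the sense of the marked word.\n\n"
    "Sentence: The bank raised its interest rates. Word: bank. "
    "Sense: financial institution\n"
    "Sentence: We sat on the bank of the river. Word: bank. "
    "Sense: sloping land beside water\n"
    "Sentence: A mouse consists of an object held in one's hand, with one "
    "or more buttons. Word: mouse. Sense:"
)
# completion = language_model.complete(prompt)  # hypothetical API call
print(prompt)
```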

Ranked #1 on Language Modelling on Penn Treebank (Word Level) (using extra training data)

COMMON SENSE REASONING COREFERENCE RESOLUTION DOMAIN ADAPTATION FEW-SHOT LEARNING LANGUAGE MODELLING NATURAL LANGUAGE INFERENCE QUESTION ANSWERING SENTENCE COMPLETION UNSUPERVISED MACHINE TRANSLATION WORD SENSE DISAMBIGUATION

6,727
28 May 2020

An Evaluation Benchmark for Testing the Word Sense Disambiguation Capabilities of Machine Translation Systems

LREC 2020 Helsinki-NLP/MuCoW

Lexical ambiguity is one of the many challenging linguistic phenomena involved in translation, i.e., translating an ambiguous word with its correct sense.

MACHINE TRANSLATION WORD SENSE DISAMBIGUATION

8
01 May 2020

FlauBERT: Unsupervised Language Model Pre-training for French

LREC 2020 huggingface/transformers

Language models have become a key step in achieving state-of-the-art results in many different Natural Language Processing (NLP) tasks.

LANGUAGE MODELLING NATURAL LANGUAGE INFERENCE TEXT CLASSIFICATION WORD SENSE DISAMBIGUATION

31,850
11 Dec 2019

Word-Class Embeddings for Multiclass Text Classification

26 Nov 2019 AlexMoreo/word-class-embeddings

Pre-trained word embeddings encode general word semantics and lexical regularities of natural language, and have proven useful across many NLP tasks, including word sense disambiguation, machine translation, and sentiment analysis, to name a few.

MACHINE TRANSLATION SENTIMENT ANALYSIS TEXT CLASSIFICATION WORD EMBEDDINGS WORD SENSE DISAMBIGUATION

4
26 Nov 2019

Improved Word Sense Disambiguation Using Pre-Trained Contextualized Word Representations

IJCNLP 2019 nusnlp/contextemb-wsd

Contextualized word representations are able to give different representations for the same word in different contexts, and they have been shown to be effective in downstream natural language processing tasks, such as question answering, named entity recognition, and sentiment analysis.

NAMED ENTITY RECOGNITION QUESTION ANSWERING SENTIMENT ANALYSIS WORD EMBEDDINGS WORD SENSE DISAMBIGUATION

12
01 Oct 2019

Does BERT Make Any Sense? Interpretable Word Sense Disambiguation with Contextualized Embeddings

23 Sep 2019 uhh-lt/bert-sense

Since vectors of the same word type can vary depending on the respective context, they implicitly provide a model for word sense disambiguation (WSD).
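
That observation suggests a simple recipe, sketched below: embed sense-annotated example sentences, then classify a new occurrence by nearest neighbour. The model choice, the naive subtoken lookup, and the two-sentence “inventory” are simplifying assumptions; the paper itself runs kNN over sense-annotated corpora such as SemCor.

```python
# Hedged sketch of nearest-neighbour WSD over contextualized embeddings.
# bert-base-uncased, the first-subtoken lookup, and the two-sentence
# "sense inventory" are simplifying assumptions, not the paper's setup.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased").eval()

def embed(sentence: str, word: str) -> torch.Tensor:
    """Contextual vector of the target word's first subtoken."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]
    word_id = tokenizer.encode(word, add_special_tokens=False)[0]
    pos = (enc["input_ids"][0] == word_id).nonzero()[0, 0]
    return hidden[pos]

# Sense-labeled support examples; a real system would draw these from SemCor.
support = [
    ("The mouse scurried into its hole.", "animal sense"),
    ("Click the left button on the mouse.", "device sense"),
]
vectors = torch.stack([embed(s, "mouse") for s, _ in support])

query = embed("A mouse consists of an object held in one's hand, "
              "with one or more buttons.", "mouse")
scores = torch.nn.functional.cosine_similarity(query, vectors)
print(support[int(scores.argmax())][1])  # expected: device sense
```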

WORD SENSE DISAMBIGUATION

31
23 Sep 2019

GlossBERT: BERT for Word Sense Disambiguation with Gloss Knowledge

IJCNLP 2019 HSLCY/GlossBERT

Word Sense Disambiguation (WSD) aims to find the exact sense of an ambiguous word in a particular context.
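
The gloss-knowledge idea can be sketched as context-gloss pairing: WSD is recast as binary sentence-pair classification, with one (context, gloss) pair per candidate sense, and the best-scoring gloss wins. The helper below only builds the pairs; the paper's weak-supervision markup and fine-tuned BERT classifier are omitted.

```python
# Hedged sketch of context-gloss pairing: one (context, gloss, label)
# triple per candidate WordNet sense of the target word. The paper's
# weak-supervision markup and fine-tuned BERT classifier are omitted.
from nltk.corpus import wordnet as wn

def context_gloss_pairs(context, target, gold=None):
    """Build sentence-pair classification instances for `target`."""
    pairs = []
    for synset in wn.synsets(target):
        label = int(synset.name() == gold) if gold else None
        pairs.append((context, f"{target}: {synset.definition()}", label))
    return pairs

pairs = context_gloss_pairs(
    "A mouse consists of an object held in one's hand, with one or more buttons.",
    "mouse",
    gold="mouse.n.04",  # the device sense in WordNet 3.0 (assumed sense key)
)
for _, gloss, label in pairs:
    print(label, "|", gloss)
```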

WORD SENSE DISAMBIGUATION

41
20 Aug 2019

Zero-shot Word Sense Disambiguation using Sense Definition Embeddings

ACL 2019 malllabiisc/EWISE

To overcome this challenge, we propose Extended WSD Incorporating Sense Embeddings (EWISE), a supervised model to perform WSD by predicting over a continuous sense embedding space as opposed to a discrete label space.
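
A hedged sketch of what predicting over a continuous sense space means: project the context vector into the sense space and score it against precomputed definition (gloss) embeddings by dot product, so any sense that has a definition can be ranked even if it never appeared in training. The dimensions and encoders below are placeholders, not the EWISE architecture.

```python
# Hedged sketch of prediction over a continuous sense-embedding space:
# the context representation is projected into the sense space and scored
# against precomputed gloss embeddings, so senses unseen during training
# can still be ranked (zero-shot). Shapes and encoders are placeholders.
import torch
import torch.nn as nn

class SenseSpaceScorer(nn.Module):
    def __init__(self, context_dim: int, sense_dim: int):
        super().__init__()
        self.proj = nn.Linear(context_dim, sense_dim)  # context -> sense space

    def forward(self, context_vec, sense_embeddings):
        # context_vec: [batch, context_dim]
        # sense_embeddings: [num_senses, sense_dim], from a definition encoder
        return self.proj(context_vec) @ sense_embeddings.T  # [batch, num_senses]

scorer = SenseSpaceScorer(context_dim=768, sense_dim=300)
context = torch.randn(1, 768)   # stand-in for an encoded occurrence of "mouse"
glosses = torch.randn(4, 300)   # stand-ins for 4 WordNet definition embeddings
print(scorer(context, glosses).argmax(dim=1))  # index of the best-scoring sense
```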

KNOWLEDGE GRAPH EMBEDDING WORD SENSE DISAMBIGUATION ZERO-SHOT LEARNING

52
01 Jul 2019