Word Sense Disambiguation

142 papers with code • 15 benchmarks • 15 datasets

The task of Word Sense Disambiguation (WSD) consists of associating words in context with their most suitable entry in a pre-defined sense inventory. The de facto sense inventory for English WSD is WordNet. For example, given the word “mouse” and the following sentence:

“A mouse consists of an object held in one's hand, with one or more buttons.”

we would assign “mouse” its electronic-device sense (the 4th sense in the WordNet sense inventory).
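The gloss-overlap idea behind classic WSD can be sketched with a minimal Lesk-style disambiguator: pick the sense whose dictionary gloss shares the most words with the target word's context. The two-sense inventory and sense keys below are hypothetical, heavily abridged stand-ins; a real system would draw glosses from WordNet.

```python
import re

# Hypothetical mini sense inventory for "mouse" (stand-in for WordNet glosses).
SENSES = {
    "mouse.animal": "small rodent with a pointed snout and a long tail",
    "mouse.device": "hand-held pointing device with one or more buttons",
}

def lesk(context: str, senses: dict) -> str:
    """Return the sense key whose gloss has the largest word overlap with the context."""
    ctx = set(re.findall(r"[a-z]+", context.lower()))
    return max(
        senses,
        key=lambda s: len(ctx & set(re.findall(r"[a-z]+", senses[s].lower()))),
    )

sentence = "A mouse consists of an object held in one's hand, with one or more buttons."
print(lesk(sentence, SENSES))  # → mouse.device
```

Here the device gloss shares words such as “held”, “hand”, and “buttons” with the example sentence, so the electronic-device sense wins; the animal gloss overlaps only on function words.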


Together We Make Sense -- Learning Meta-Sense Embeddings from Pretrained Static Sense Embeddings

livnlp/npms 30 May 2023

Our proposed method can combine source sense embeddings that cover different sets of word senses.

★ 3

Adversarial Multi-task Learning for End-to-end Metaphor Detection

silasthu/admul 26 May 2023

We leverage adversarial training to align the data distributions of metaphor detection (MD) and basic sense discrimination (BSD) in the same feature space, so task-invariant representations can be learned.

★ 1

The CoT Collection: Improving Zero-shot and Few-shot Learning of Language Models via Chain-of-Thought Fine-Tuning

kaist-lklab/cot-collection 23 May 2023

Furthermore, we show that instruction tuning with CoT Collection allows LMs to possess stronger few-shot learning capabilities on 4 domain-specific tasks, resulting in an improvement of +2.24% (Flan-T5 3B) and +2.37% (Flan-T5 11B), even outperforming ChatGPT utilizing demonstrations until the max length by a +13.98% margin.

★ 189

Ambiguity Meets Uncertainty: Investigating Uncertainty Estimation for Word Sense Disambiguation

ryanliut/wsd-ue 22 May 2023

Word sense disambiguation (WSD), which aims to determine an appropriate sense for a target word given its context, is crucial for natural language understanding.

★ 1

Knowledge-Design: Pushing the Limit of Protein Design via Knowledge Refinement

A4Bio/OpenCPD 20 May 2023

After witnessing the great success of pretrained models on diverse protein-related tasks and the fact that recovery is highly correlated with confidence, we wonder whether this knowledge can push the limits of protein design further.

★ 141

CWTM: Leveraging Contextualized Word Embeddings from BERT for Neural Topic Modeling

fitz-like-coding/cwtm 16 May 2023

Most existing topic models rely on bag-of-words (BOW) representation, which limits their ability to capture word order information and leads to challenges with out-of-vocabulary (OOV) words in new documents.

★ 2

Perturbation-based QE: An Explainable, Unsupervised Word-level Quality Estimation Method for Blackbox Machine Translation

tuanh23/perturbation-basedqe 12 May 2023

Quality Estimation (QE) is the task of predicting the quality of Machine Translation (MT) system output, without using any gold-standard translation references.

★ 2

Context-Aware Semantic Similarity Measurement for Unsupervised Word Sense Disambiguation

jorge-martinez-gil/uwsd 5 May 2023

Word sense ambiguity poses a significant challenge in natural language processing because of the scarcity of annotated data for training machine learning models on the task.

★ 2

Vision Meets Definitions: Unsupervised Visual Word Sense Disambiguation Incorporating Gloss Information

soon91jae/uvwsd 2 May 2023

Specifically, we suggest employing Bayesian inference to incorporate the sense definitions when sense information of the answer is not provided.

★ 4

LaMini-LM: A Diverse Herd of Distilled Models from Large-Scale Instructions

mbzuai-nlp/lamini-lm 27 Apr 2023

The results demonstrate that our proposed LaMini-LM models are comparable to competitive baselines, while being much smaller in size.

★ 800