Does BERT Make Any Sense? Interpretable Word Sense Disambiguation with Contextualized Embeddings

23 Sep 2019 · Gregor Wiedemann, Steffen Remus, Avi Chawla, Chris Biemann

Contextualized word embeddings (CWE) such as provided by ELMo (Peters et al., 2018), Flair NLP (Akbik et al., 2018), or BERT (Devlin et al., 2019) are a major recent innovation in NLP. CWEs provide semantic vector representations of words depending on their respective context...
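The kNN-BERT entries in the results table below refer to nearest-neighbor classification over such contextualized token vectors. The following is a minimal sketch of that idea, assuming the Hugging Face transformers and scikit-learn libraries rather than the authors' implementation; the sense-annotated sentences and sense labels are purely illustrative stand-ins for corpora such as SemCor or WNGT.

    # A minimal sketch of kNN word sense disambiguation over BERT contextualized
    # embeddings (not the authors' code). Assumes the Hugging Face `transformers`
    # and `scikit-learn` libraries; sentences and sense labels are toy examples.
    import torch
    from transformers import AutoModel, AutoTokenizer
    from sklearn.neighbors import KNeighborsClassifier

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")
    model.eval()

    def embed_word(sentence: str, word: str) -> torch.Tensor:
        """Contextualized vector for `word` in `sentence`: mean of its
        WordPiece sub-token vectors from BERT's last hidden layer."""
        enc = tokenizer(sentence, return_tensors="pt", return_offsets_mapping=True)
        offsets = enc.pop("offset_mapping")[0].tolist()
        with torch.no_grad():
            hidden = model(**enc).last_hidden_state[0]        # (seq_len, 768)
        start = sentence.lower().index(word.lower())          # char span of target
        end = start + len(word)
        idx = [i for i, (s, e) in enumerate(offsets)
               if s >= start and e <= end and e > s]          # skips special tokens
        return hidden[idx].mean(dim=0)

    # Toy sense-annotated examples standing in for SemCor / WNGT training data.
    train = [
        ("He sat on the bank of the river.",        "bank", "bank%river"),
        ("The fisherman rested on the muddy bank.", "bank", "bank%river"),
        ("She deposited the money at the bank.",    "bank", "bank%finance"),
        ("The bank raised its interest rates.",     "bank", "bank%finance"),
    ]
    X = torch.stack([embed_word(s, w) for s, w, _ in train]).numpy()
    y = [sense for _, _, sense in train]

    # Nearest-neighbor sense classification in the contextualized embedding space.
    knn = KNeighborsClassifier(n_neighbors=1, metric="cosine").fit(X, y)
    query = embed_word("A boat was moored at the bank.", "bank").numpy()
    print(knn.predict([query]))   # likely ['bank%river'] with this toy data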

TASK                       DATASET                    MODEL                                       METRIC  VALUE  GLOBAL RANK
Word Sense Disambiguation  SemEval 2007 Task 17       kNN-BERT                                    F1      60.94  #8
Word Sense Disambiguation  SemEval 2007 Task 17       kNN-BERT + POS (training corpus: SemCor)    F1      63.17  #7
Word Sense Disambiguation  SemEval 2007 Task 7        kNN-BERT                                    F1      81.20  #8
Word Sense Disambiguation  SemEval 2007 Task 7        kNN-BERT + POS (training corpus: WNGT)      F1      85.32  #3
Word Sense Disambiguation  SensEval 2 Lexical Sample  kNN-BERT                                    F1      76.52  #1
Word Sense Disambiguation  SensEval 3 Lexical Sample  kNN-BERT                                    F1      80.12  #1

Methods used in the Paper