CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters

20 Oct 2020 • Hicham El Boukkouri • Olivier Ferret • Thomas Lavergne • Hiroshi Noji • Pierre Zweigenbaum • Junichi Tsujii

Due to the compelling improvements brought by BERT, many recent representation models adopted the Transformer architecture as their main building block, consequently inheriting the wordpiece tokenization system despite it not being intrinsically linked to the notion of Transformers. While this system is thought to achieve a good balance between the flexibility of characters and the efficiency of full words, using predefined wordpiece vocabularies from the general domain is not always suitable, especially when building models for specialized domains (e.g., the medical domain)...
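The over-segmentation the abstract alludes to is easy to reproduce with a general-domain wordpiece tokenizer. The snippet below is a minimal sketch using the Hugging Face transformers library (not the paper's own code); the exact subword splits in the comments are illustrative assumptions and may vary with the tokenizer version and vocabulary.

```python
from transformers import BertTokenizer

# General-domain wordpiece vocabulary (bert-base-uncased).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# A frequent general-domain word is typically kept whole:
print(tokenizer.tokenize("treatment"))            # e.g. ['treatment']

# A specialized medical term is split into many wordpieces,
# e.g. something like ['cho', '##led', '##och', '##olith', '##iasis']
# (the exact split depends on the vocabulary used).
print(tokenizer.tokenize("choledocholithiasis"))

# CharacterBERT avoids this by building each word's representation
# directly from its characters, so no fixed wordpiece vocabulary is needed.
```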


Results from the Paper


TASK                               DATASET                      MODEL                                      METRIC                VALUE   GLOBAL RANK
Clinical Concept Extraction        2010 i2b2/VA                 CharacterBERT (base, medical)              Exact Span F1         89.24   # 2
Relation Extraction                ChemProt                     CharacterBERT (base, medical)              Micro F1              73.44   # 1
Semantic Similarity                ClinicalSTS                  CharacterBERT (base, medical, ensemble)    Pearson Correlation   85.62   # 1
Drug–Drug Interaction Extraction   DDI extraction 2013 corpus   CharacterBERT (base, medical)              Micro F1              80.38   # 1
Natural Language Inference         MedNLI                       CharacterBERT (base, medical)              Accuracy              84.95   # 1

Methods used in the Paper