Semantics-aware BERT for Language Understanding

5 Sep 2019 · Zhuosheng Zhang, Yuwei Wu, Hai Zhao, Zuchao Li, Shuailiang Zhang, Xi Zhou, Xiang Zhou

The latest work on language representations carefully integrates contextualized features into language model training, which has enabled a series of successes, especially on machine reading comprehension and natural language inference tasks. However, existing language representation models, including ELMo, GPT and BERT, only exploit plain context-sensitive features such as character or word embeddings...
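The paper's core idea is to enrich BERT's contextual token representations with explicit semantic cues from a pre-trained semantic role labeler. The snippet below is a minimal sketch of what such a fusion layer could look like in PyTorch; the GRU-based label encoder, the dimensions, and the class name SemanticFusion are illustrative assumptions, not the paper's exact implementation.

import torch
import torch.nn as nn

class SemanticFusion(nn.Module):
    """Sketch: fuse contextual token embeddings (e.g. from BERT) with
    embeddings of semantic role labels. Hyperparameters are assumptions."""

    def __init__(self, bert_dim=768, num_labels=50, label_dim=10, hidden_dim=200):
        super().__init__()
        self.label_emb = nn.Embedding(num_labels, label_dim)        # one vector per SRL tag
        self.label_gru = nn.GRU(label_dim, hidden_dim,
                                batch_first=True, bidirectional=True)  # contextualize the tag sequence
        self.fuse = nn.Linear(bert_dim + 2 * hidden_dim, bert_dim)   # project back to model width

    def forward(self, bert_hidden, srl_tags):
        # bert_hidden: (batch, seq_len, bert_dim) word-level contextual representations
        # srl_tags:    (batch, seq_len) integer semantic role label ids
        tag_vecs, _ = self.label_gru(self.label_emb(srl_tags))
        return self.fuse(torch.cat([bert_hidden, tag_vecs], dim=-1))

# Usage with random tensors, purely for shape checking:
# fusion = SemanticFusion()
# out = fusion(torch.randn(2, 16, 768), torch.randint(0, 50, (2, 16)))  # -> (2, 16, 768)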


Results from the Paper


Task                        Dataset        Model                    Metric              Value    Global Rank
Natural Language Inference  SNLI           SemBERT                  Test Accuracy (%)   91.9     #1
Natural Language Inference  SNLI           SemBERT                  Train Accuracy (%)  94.4     #14
Natural Language Inference  SNLI           SemBERT                  Parameters          339M     #2
Question Answering          SQuAD2.0       SemBERT (single model)   EM                  84.800   #82
Question Answering          SQuAD2.0       SemBERT (single model)   F1                  87.864   #83
Question Answering          SQuAD2.0       SemBERT (ensemble)       EM                  86.166   #63
Question Answering          SQuAD2.0       SemBERT (ensemble)       F1                  88.886   #71
Question Answering          SQuAD2.0 dev   SemBERT large            EM                  80.9     #6
Question Answering          SQuAD2.0 dev   SemBERT large            F1                  83.6     #8

Methods used in the Paper