Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing

31 Jul 2020

Yu Gu, Robert Tinn, Hao Cheng, Michael Lucas, Naoto Usuyama, Xiaodong Liu, Tristan Naumann, Jianfeng Gao, Hoifung Poon

Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on general domain corpora, such as newswire and Web...
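The paper's premise is that pretraining directly on in-domain text (here, biomedical literature) can outperform general-domain or mixed-domain pretraining. As a minimal sketch of how such a domain-specific checkpoint would be consumed downstream, the snippet below loads a biomedical BERT-style model for masked-token prediction with the Hugging Face transformers library; the model identifier is an assumption for illustration, not something stated in this excerpt.

```python
# Minimal sketch: querying a biomedical domain-specific BERT checkpoint for
# masked-token prediction. The model name below is an assumed Hub identifier;
# substitute the checkpoint actually released with the paper.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"  # assumed ID
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# A biomedical sentence with one masked token.
text = f"Aspirin inhibits {tokenizer.mask_token} aggregation."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and report the top-5 predicted tokens.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_index].topk(5).indices[0].tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```

The same checkpoint can be fine-tuned for downstream biomedical NLP tasks (e.g., named entity recognition or relation extraction) in the standard way, by swapping `AutoModelForMaskedLM` for a task-specific head such as `AutoModelForTokenClassification`.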
