COVID-Twitter-BERT: A Natural Language Processing Model to Analyse COVID-19 Content on Twitter

15 May 2020 · Martin Müller, Marcel Salathé, Per E Kummervold

In this work, we release COVID-Twitter-BERT (CT-BERT), a transformer-based model, pretrained on a large corpus of Twitter messages on the topic of COVID-19. Our model shows a 10-30% marginal improvement compared to its base model, BERT-Large, on five different classification datasets...
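As a minimal sketch of how a released checkpoint like CT-BERT could be used for downstream classification, the snippet below loads the model with the Hugging Face Transformers library. The model identifier, number of labels, and example tweet are assumptions for illustration, not details stated on this page; substitute the path and task setup published by the authors.

```python
# Hypothetical usage sketch: load a CT-BERT-style checkpoint for sequence
# classification. The Hub id below is an assumed location of the weights.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "digitalepidemiologylab/covid-twitter-bert"  # assumed model id

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# num_labels=3 is an arbitrary example; the classification head is freshly
# initialised and must be fine-tuned on a labelled dataset before use.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)

inputs = tokenizer(
    "Vaccines are now available at local pharmacies.",  # example tweet
    return_tensors="pt",
    truncation=True,
    max_length=96,
)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # class probabilities (meaningful only after fine-tuning)
```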

