Pre-training Polish Transformer-based Language Models at Scale

7 Jun 2020 · Sławomir Dadas, Michał Perełkiewicz, Rafał Poświata

Transformer-based language models are now widely used in Natural Language Processing (NLP). This is especially true for the English language, for which many pre-trained models based on the transformer architecture have been published in recent years...


