Transferring Monolingual Model to Low-Resource Language: The Case of Tigrinya

13 Jun 2020 · Abrhalei Tela, Abraham Woubie, Ville Hautamaki

In recent years, transformer models have achieved great success in natural language processing (NLP) tasks. Most current state-of-the-art NLP results are achieved with monolingual transformer models, where the model is pre-trained on a single-language unlabelled text corpus...
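A minimal sketch of the general idea of adapting a monolingual pre-trained transformer to a new target language is given below, using the Hugging Face transformers library. This is not the authors' released code; the model name, corpus path, and vocabulary size are illustrative assumptions.

```python
# Sketch: re-using a monolingual (English) pre-trained transformer for a new
# target language. Names and hyperparameters below are illustrative only.
from transformers import BertTokenizerFast, BertForMaskedLM

# Train a new WordPiece tokenizer on unlabelled target-language text
# (the corpus path is a placeholder assumption).
source_tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
target_tokenizer = source_tokenizer.train_new_from_iterator(
    (line for line in open("tigrinya_corpus.txt", encoding="utf-8")),
    vocab_size=30_000,
)

# Load the English pre-trained weights; the transformer body keeps its
# learned parameters, only the vocabulary-dependent parts change.
model = BertForMaskedLM.from_pretrained("bert-base-cased")

# Resize and re-initialize the embedding matrix so it matches the new
# vocabulary, which no longer lines up with the original English WordPieces.
model.resize_token_embeddings(len(target_tokenizer))
model.get_input_embeddings().weight.data.normal_(
    mean=0.0, std=model.config.initializer_range
)

# From here, one would typically continue masked-language-model pre-training
# on the unlabelled target-language corpus, then fine-tune on a labelled
# downstream task in that language.
```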
