Transferring Monolingual Model to Low-Resource Language: The Case of Tigrinya

In recent years, transformer models have achieved great success in natural language processing (NLP) tasks. Most current state-of-the-art NLP results are obtained with monolingual transformer models, which are pre-trained on a large unlabelled text corpus in a single language...
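The abstract does not spell out the transfer procedure, but a common way to adapt a monolingual pre-trained model to a new low-resource language is to re-initialize its embedding layer for the target vocabulary, copying vectors for tokens shared with the source vocabulary and randomly initializing the rest before continued training. The sketch below illustrates that idea in plain Python; the function name `transfer_embeddings` and the toy vocabularies are illustrative assumptions, not the paper's actual method.

```python
import random

def transfer_embeddings(src_vocab, src_emb, tgt_vocab, dim, seed=0):
    """Initialize target-language embeddings from a source model.

    Tokens present in both vocabularies (e.g. shared subwords and
    special tokens) copy their source vectors; tokens new to the
    target language get small random vectors to be learned later.
    This is a conceptual sketch, not the paper's exact procedure.
    """
    rng = random.Random(seed)
    tgt_emb = {}
    for tok in tgt_vocab:
        if tok in src_vocab:
            tgt_emb[tok] = src_emb[tok]          # reuse pre-trained vector
        else:
            tgt_emb[tok] = [rng.uniform(-0.01, 0.01) for _ in range(dim)]
    return tgt_emb

# Toy example: an English source vocabulary and a Tigrinya target
# vocabulary that shares special tokens and one subword.
src_vocab = ["the", "##ing", "[CLS]", "[SEP]"]
src_emb = {t: [float(i)] * 4 for i, t in enumerate(src_vocab)}
tgt_vocab = ["[CLS]", "[SEP]", "ሰላም", "##ing"]

emb = transfer_embeddings(set(src_vocab), src_emb, tgt_vocab, dim=4)
```

Here `[CLS]`, `[SEP]`, and `##ing` inherit their pre-trained vectors, while the Tigrinya word ሰላም starts from a fresh random vector that the subsequent fine-tuning on target-language data would learn.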
