no code implementations • COLING 2020 • Gaétan Baert, Souhir Gahbiche, Guillaume Gadek, Alexandre Pauchet
We show that a language model (BAERT) pre-trained on a large corpus (LAD) in the same language (Arabizi) as the fine-tuning dataset (SALAD) outperforms a state-of-the-art multilingual pre-trained model (multilingual BERT) on a sentiment analysis task.
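As a rough illustration of the pre-train-then-fine-tune setup the abstract describes, the sketch below fine-tunes a BERT-style checkpoint for sentiment classification with Hugging Face Transformers. The paper's BAERT model, LAD corpus, and SALAD dataset are not publicly released (the entry notes no code implementations), so the model name, file names, and label count here are placeholders, not the authors' actual resources.

```python
# Illustrative sketch only: BAERT, LAD, and SALAD are not publicly available,
# so a generic multilingual checkpoint and hypothetical CSV files stand in.
import pandas as pd
import torch
from torch.utils.data import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL_NAME = "bert-base-multilingual-cased"  # stand-in; not the paper's BAERT

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)


class SentimentDataset(Dataset):
    """Wraps a dataframe with 'text' and 'label' columns (hypothetical SALAD-like data)."""

    def __init__(self, df):
        self.enc = tokenizer(list(df["text"]), truncation=True,
                             padding=True, max_length=128)
        self.labels = list(df["label"])

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item


# Placeholder file names for a sentiment-labelled Arabizi corpus.
train_ds = SentimentDataset(pd.read_csv("train.csv"))
eval_ds = SentimentDataset(pd.read_csv("dev.csv"))

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=train_ds,
    eval_dataset=eval_ds,
)
trainer.train()
```

Swapping the placeholder checkpoint for a model pre-trained on in-language data (as the paper does with BAERT on LAD) is the only change the comparison in the abstract would require.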