ParsBERT: Transformer-based Model for Persian Language Understanding

26 May 2020 · Mehrdad Farahani, Mohammad Gharachorloo, Marzieh Farahani, Mohammad Manthouri

The surge of pre-trained language models has ushered in a new era in Natural Language Processing (NLP) by allowing us to build powerful language models. Among these models, Transformer-based models such as BERT have become increasingly popular due to their state-of-the-art performance…
