RoBERTa is an extension of BERT with changes to the pretraining procedure. The modifications include:

- training the model longer, with bigger batches, over more data;
- removing the next sentence prediction objective;
- training on longer sequences;
- dynamically changing the masking pattern applied to the training data.
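The dynamic masking change can be illustrated with a small sketch: instead of fixing the masked positions once during preprocessing (as original BERT did), a fresh mask is sampled every time a sequence is fed to the model. The function below is a minimal, hypothetical illustration in pure Python; the 80/10/10 replacement split follows BERT's masking recipe, and the `-100` label value is only an assumed "ignore this position in the loss" convention, not part of the paper itself.

```python
import random

def dynamic_mask(token_ids, mask_id, vocab_size, special_ids, p=0.15, rng=None):
    """Apply BERT-style masking to a fresh copy of the sequence.

    Called once per training pass, this yields a different mask each
    epoch (dynamic masking), unlike a mask fixed at preprocessing time.
    """
    rng = rng or random.Random()
    masked = list(token_ids)
    labels = [-100] * len(token_ids)   # -100 = "ignore" (assumed loss convention)
    for i, tok in enumerate(token_ids):
        if tok in special_ids or rng.random() >= p:
            continue                    # never mask special tokens
        labels[i] = tok                 # the model must predict the original token
        roll = rng.random()
        if roll < 0.8:
            masked[i] = mask_id         # 80%: replace with the mask token
        elif roll < 0.9:
            masked[i] = rng.randrange(vocab_size)  # 10%: random token
        # remaining 10%: keep the original token unchanged
    return masked, labels

# The same sentence can receive a different mask on each pass
# (0 and 2 stand in for start/end special tokens; ids are made up):
ids = [0, 11, 42, 7, 99, 2]
rng = random.Random(0)
m1, _ = dynamic_mask(ids, mask_id=103, vocab_size=30000, special_ids={0, 2}, rng=rng)
m2, _ = dynamic_mask(ids, mask_id=103, vocab_size=30000, special_ids={0, 2}, rng=rng)
```

Because the mask is resampled per pass, the model rarely sees the exact same corrupted input twice, which the RoBERTa authors found to match or slightly improve on static masking.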
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 76 | 9.26% |
| Sentence | 58 | 7.06% |
| Sentiment Analysis | 41 | 4.99% |
| Question Answering | 32 | 3.90% |
| Text Classification | 32 | 3.90% |
| Classification | 24 | 2.92% |
| Natural Language Understanding | 17 | 2.07% |
| Named Entity Recognition (NER) | 16 | 1.95% |
| NER | 15 | 1.83% |