no code implementations • RANLP 2021 • Koichi Nagatsuka, Clifford Broni-Bediako, Masayasu Atsumi
Recently, pre-trained language representation models such as BERT and RoBERTa have achieved significant results in a wide range of natural language processing (NLP) tasks; however, they require extremely high computational cost.