Search Results for author: Koichi Nagatsuka

Found 1 paper, 0 papers with code

Pre-training a BERT with Curriculum Learning by Increasing Block-Size of Input Text

no code implementations • RANLP 2021 • Koichi Nagatsuka, Clifford Broni-Bediako, Masayasu Atsumi

Recently, pre-trained language representation models such as BERT and RoBERTa have achieved significant results in a wide range of natural language processing (NLP) tasks; however, they require an extremely high computational cost.
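To illustrate the idea named in the title, here is a minimal sketch of curriculum learning by increasing the block size (tokens per training example) during pre-training. This is not the authors' implementation; the schedule values, the stage length, and the helper names (`BLOCK_SIZE_SCHEDULE`, `STEPS_PER_STAGE`, `chunk_into_blocks`, `block_size_for_step`) are all hypothetical and purely illustrative.

```python
# Minimal sketch (not the paper's code): curriculum learning for BERT-style
# pre-training by gradually increasing the block size of the input text.
# All schedule values below are illustrative assumptions.

from typing import Iterator, List

# Hypothetical curriculum: train on short blocks first, longer blocks later.
BLOCK_SIZE_SCHEDULE = [64, 128, 256, 512]   # tokens per example, per stage
STEPS_PER_STAGE = 10_000                    # illustrative stage length


def chunk_into_blocks(token_ids: List[int], block_size: int) -> Iterator[List[int]]:
    """Split a token-id stream into fixed-size blocks for masked-LM pre-training."""
    for start in range(0, len(token_ids) - block_size + 1, block_size):
        yield token_ids[start:start + block_size]


def block_size_for_step(step: int) -> int:
    """Return the block size for the current training step under the curriculum."""
    stage = min(step // STEPS_PER_STAGE, len(BLOCK_SIZE_SCHEDULE) - 1)
    return BLOCK_SIZE_SCHEDULE[stage]


if __name__ == "__main__":
    # Toy corpus of token ids; in practice these come from a tokenizer.
    corpus = list(range(2048))
    for step in (0, 15_000, 35_000):
        bs = block_size_for_step(step)
        n_blocks = sum(1 for _ in chunk_into_blocks(corpus, bs))
        print(f"step {step}: block size {bs}, {n_blocks} blocks from toy corpus")
```

The intent of such a schedule is that early training on short blocks is cheaper per step, with the full block size used only in the later stages.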
