13 Dec 2019 • James Yi Tian, Alexander P. Kreuzer, Pai-Hung Chen, Hans-Martin Will
Transformer-based Very Large Language Models (VLLMs), such as BERT, XLNet, and RoBERTa, have recently shown tremendous performance on a wide variety of Natural Language Understanding (NLU) tasks.