Search Results for author: Hans-Martin Will

Found 1 paper, 0 papers with code

WaLDORf: Wasteless Language-model Distillation On Reading-comprehension

no code implementations · 13 Dec 2019 · James Yi Tian, Alexander P. Kreuzer, Pai-Hung Chen, Hans-Martin Will

Transformer-based Very Large Language Models (VLLMs) like BERT, XLNet, and RoBERTa have recently shown tremendous performance on a large variety of Natural Language Understanding (NLU) tasks.

Language Modelling · Natural Language Understanding · +1
