DeLighT: Very Deep and Light-weight Transformer

We introduce a very deep and light-weight transformer, DeLighT, that delivers similar or better performance than transformer-based models with significantly fewer parameters. DeLighT allocates parameters more efficiently both (1) within each Transformer block, using DExTra, a deep and light-weight transformation, and (2) across blocks, using block-wise scaling, which allows for shallower and narrower DeLighT blocks near the input and wider and deeper DeLighT blocks near the output.
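The block-wise scaling idea can be illustrated with a minimal sketch: each DeLighT block is assigned a depth that grows from the input side to the output side. The linear schedule and the `n_min`/`n_max` parameters below are illustrative assumptions, not the paper's exact configuration.

```python
# Hedged sketch of block-wise scaling: shallower blocks near the input,
# deeper blocks near the output. The linear interpolation schedule and the
# n_min/n_max bounds are assumptions for illustration only.

def blockwise_depths(num_blocks: int, n_min: int, n_max: int) -> list[int]:
    """Assign a per-block depth by linearly interpolating from n_min to n_max."""
    if num_blocks == 1:
        return [n_min]
    return [
        round(n_min + (n_max - n_min) * b / (num_blocks - 1))
        for b in range(num_blocks)
    ]

# Five blocks, scaling depth from 4 (input side) to 8 (output side).
print(blockwise_depths(5, 4, 8))  # → [4, 5, 6, 7, 8]
```

Under this schedule, parameters concentrate in the later blocks, which is the allocation pattern the abstract describes.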

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Machine Translation | IWSLT2014 German-English | DeLighT | BLEU score | 35.3 | #9 |
| Language Modelling | WikiText-103 | DeLighT | Test perplexity | 24.14 | #22 |
| Language Modelling | WikiText-103 | DeLighT | Number of params | 99M | #15 |
| Machine Translation | WMT2016 English-French | DeLighT | BLEU score | 40.5 | #1 |
| Machine Translation | WMT2016 English-German | DeLighT | BLEU score | 28.0 | #4 |
| Machine Translation | WMT2016 English-Romanian | DeLighT | BLEU score | 34.7 | #1 |
