no code implementations • 19 Feb 2022 • Tommi Gröndahl, Yujia Guo, N. Asokan
To facilitate this, we run experiments on four sequence modelling tasks with the T5 Transformer under two settings: zero-shot generalization, and generalization across class-specific vocabularies flipped between the training and test sets.
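The "flipped vocabularies" setting can be sketched as follows. This is a minimal illustrative example, not the paper's actual implementation: tokens associated with one class during training are swapped with those of another class at test time, so a model only succeeds if it generalizes beyond surface vocabulary. The function name, token lists, and sentences below are all hypothetical.

```python
def flip_vocabulary(text, vocab_a, vocab_b):
    """Swap each class-A token with its class-B counterpart (and vice versa).

    Hypothetical helper: pairs tokens positionally, then replaces any
    occurrence of a paired token in the whitespace-split text.
    """
    mapping = {**dict(zip(vocab_a, vocab_b)), **dict(zip(vocab_b, vocab_a))}
    return " ".join(mapping.get(tok, tok) for tok in text.split())

# Illustrative class-specific vocabularies (not from the paper).
vocab_pos = ["wonderful", "great"]
vocab_neg = ["terrible", "awful"]

train_example = "the movie was wonderful"
test_example = flip_vocabulary(train_example, vocab_pos, vocab_neg)
print(test_example)  # the movie was terrible
```

A model trained where "wonderful" signals the positive class would then be evaluated on inputs where that token has been exchanged for its negative-class counterpart, probing whether it learned the task rather than memorizing class-specific words.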