ACL (RepL4NLP) 2021 • Seungwon Kim, Alex Shum, Nathan Susanj, Jonathan Hilgart
Pretrained language models have served as the backbone for many state-of-the-art NLP results.
Continual Pretraining • Transfer Learning