Search Results for author: Guangyue Peng

Found 1 paper, 0 papers with code

Semiparametric Language Models Are Scalable Continual Learners

no code implementations · 2 Mar 2023 · Guangyue Peng, Tao Ge, Si-Qing Chen, Furu Wei, Houfeng Wang

We demonstrate that SeMem improves the scalability of semiparametric LMs for continual learning over streaming data in two ways:

(1) Data-wise scalability: as the model becomes stronger through continual learning, it encounters fewer difficult cases that need to be memorized, so the non-parametric memory grows more and more slowly over time rather than linearly with the size of the training data.

(2) Model-wise scalability: SeMem allows a larger model to memorize fewer samples than its smaller counterpart, because a larger model more rarely encounters incomprehensible cases; as a result, the non-parametric memory does not scale linearly with model size.

Tasks: Continual Learning · Language Modelling · +1
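
The memorize-only-hard-cases mechanism described in the abstract can be made concrete with a short sketch. This is a hypothetical toy, not the paper's code: the threshold `tau`, the `DummyLM` stand-in, and the per-sample `difficulty` score are all assumptions standing in for SeMem's actual memorization criterion.

```python
# Toy sketch of SeMem-style selective memorization (an assumption, not the
# paper's implementation): a sample enters the non-parametric memory only
# when the parametric LM finds it difficult, so the memory grows sublinearly
# as continual learning makes the model stronger.

from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class SelectiveMemory:
    """Key-value memory that admits only hard cases."""
    tau: float                                   # difficulty threshold (assumed hyperparameter)
    store: Dict[Any, Any] = field(default_factory=dict)

    def maybe_memorize(self, key: Any, value: Any, difficulty: float) -> bool:
        # Memorize only cases the model cannot yet handle; easy cases are
        # skipped, which is what slows memory growth over time.
        if difficulty > self.tau:
            self.store[key] = value
            return True
        return False


class DummyLM:
    """Stand-in for a parametric LM whose loss shrinks as it 'learns'."""
    def __init__(self) -> None:
        self.steps = 0

    def loss(self, x: str) -> float:
        # Toy difficulty score that decays with training progress.
        self.steps += 1
        return len(x) / (1.0 + 0.1 * self.steps)


if __name__ == "__main__":
    model, memory = DummyLM(), SelectiveMemory(tau=2.0)
    stream = [f"example-{i}" for i in range(50)]
    for x in stream:
        memory.maybe_memorize(x, value=x, difficulty=model.loss(x))
    # Early (high-loss) samples are memorized; later ones mostly are not,
    # mirroring the data-wise scalability claim in the abstract.
    print(f"memorized {len(memory.store)} of {len(stream)} samples")
```

Running the toy shows the claimed behavior on a small scale: only the early, high-difficulty samples are stored, and admissions taper off as the model's loss falls below the threshold.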
