Towards Learning to Remember in Meta Learning of Sequential Domains

1 Jan 2021  ·  Zhenyi Wang, Tiehang Duan, Donglin Zhan, Changyou Chen

Meta learning has made rapid progress in the past few years, with a recent extension to the continual learning setting. However, its natural generalization to the sequential-domain setting, where catastrophic forgetting must be avoided, has not been well investigated. Through extensive empirical verification, we find that existing techniques for avoiding catastrophic forgetting do not work well in our sequential-domain meta-learning setting. To tackle this problem, we propose a simple double-adaptation process for both parameters and learning rates. Adaptation of the parameters ensures good generalization performance, while adaptation of the learning rates is designed to avoid catastrophic forgetting of past domains. Extensive experiments on a sequence of commonly used real-world domain datasets demonstrate the effectiveness of our proposed method, which outperforms current state-of-the-art techniques in continual learning. Our code is made publicly available online (anonymous).
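
To make the "double adaptation" idea concrete, below is a minimal sketch of one possible reading of it, assuming the method resembles a MAML/Meta-SGD-style loop in which per-parameter learning rates are themselves meta-parameters adapted alongside the model weights. All names here (SimpleNet, meta_train_step, inner_steps, the 0.01 initial step size) are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch: MAML-style inner adaptation of parameters plus
# meta-learned per-parameter learning rates (a Meta-SGD-like mechanism).
# Not the paper's code; names and hyperparameters are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleNet(nn.Module):
    def __init__(self, in_dim=28 * 28, hidden=64, out_dim=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, out_dim)

    def forward(self, x, params=None):
        # Functional forward pass with externally supplied parameters,
        # needed so the inner-loop adapted weights can be used directly.
        if params is None:
            params = dict(self.named_parameters())
        x = F.relu(F.linear(x, params["fc1.weight"], params["fc1.bias"]))
        return F.linear(x, params["fc2.weight"], params["fc2.bias"])

model = SimpleNet()
# One learnable step size per parameter tensor: the second "adaptation".
lrs = nn.ParameterDict({
    name.replace(".", "_"): nn.Parameter(torch.tensor(0.01))
    for name, _ in model.named_parameters()
})
meta_opt = torch.optim.Adam(
    list(model.parameters()) + list(lrs.parameters()), lr=1e-3
)

def meta_train_step(support_x, support_y, query_x, query_y, inner_steps=1):
    """One meta-training step on a task drawn from the current domain."""
    params = {n: p for n, p in model.named_parameters()}
    for _ in range(inner_steps):
        loss = F.cross_entropy(model(support_x, params), support_y)
        grads = torch.autograd.grad(loss, list(params.values()), create_graph=True)
        # Inner adaptation: each parameter uses its own meta-learned step size.
        params = {
            n: p - lrs[n.replace(".", "_")] * g
            for (n, p), g in zip(params.items(), grads)
        }
    # Outer loss on the query set updates both weights and learning rates.
    query_loss = F.cross_entropy(model(query_x, params), query_y)
    meta_opt.zero_grad()
    query_loss.backward()
    meta_opt.step()
    return query_loss.item()
```

In a sequential-domain setting, meta_train_step would be called on tasks from each domain in turn; the meta-learned per-parameter step sizes are the component the abstract suggests is adapted to mitigate forgetting of earlier domains.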
