Continual Learning of a Mixed Sequence of Similar and Dissimilar Tasks

NeurIPS 2020 · Zixuan Ke, Bing Liu, Xingchang Huang

Existing research on continual learning of a sequence of tasks has focused on dealing with catastrophic forgetting, where the tasks are assumed to be dissimilar and have little shared knowledge. Some work has also been done to transfer previously learned knowledge to the new task when the tasks are similar and share knowledge. To the best of our knowledge, no technique has been proposed to learn a sequence of mixed similar and dissimilar tasks that can both deal with forgetting and transfer knowledge forward and backward. This paper proposes such a technique, which learns both types of tasks in the same network. For dissimilar tasks, the algorithm focuses on dealing with forgetting; for similar tasks, it selectively transfers the knowledge learned from some similar previous tasks to improve learning of the new task. The algorithm also automatically detects whether a new task is similar to any previous task. Empirical evaluation using sequences of mixed tasks demonstrates the effectiveness of the proposed model.
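
The abstract sketches a three-part recipe: detect whether a new task is similar to earlier ones, transfer knowledge when it is, and protect earlier knowledge when it is not. The snippet below is a minimal, hypothetical illustration of the detection step only, written in PyTorch: a previous task is flagged as similar if a probe trained on its frozen features outperforms a reference probe trained on the raw inputs alone. The function names (`quick_fit`, `accuracy`, `is_similar`) and the comparison margin are illustrative assumptions; the paper's actual model (CAT) implements detection and selective transfer with task masks and attention rather than this exact test.

```python
import torch
import torch.nn as nn

def quick_fit(head, feats, labels, epochs=50, lr=0.05):
    """Train a linear probe for a few epochs (full-batch, for brevity)."""
    opt = torch.optim.SGD(head.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(head(feats), labels).backward()
        opt.step()

def accuracy(head, feats, labels):
    with torch.no_grad():
        return (head(feats).argmax(dim=1) == labels).float().mean().item()

def is_similar(old_encoder, x_train, y_train, x_val, y_val, n_classes, margin=0.0):
    """Hypothetical similarity test: a previous task counts as 'similar' if
    transferring its frozen representation beats a reference probe trained
    on the raw inputs alone."""
    # Reference: probe trained directly on the new task's inputs.
    ref = nn.Linear(x_train.shape[1], n_classes)
    quick_fit(ref, x_train, y_train)
    # Transfer: the same kind of probe, but on frozen features from the old task.
    with torch.no_grad():
        f_train, f_val = old_encoder(x_train), old_encoder(x_val)
    trans = nn.Linear(f_train.shape[1], n_classes)
    quick_fit(trans, f_train, y_train)
    return accuracy(trans, f_val, y_val) > accuracy(ref, x_val, y_val) + margin
```

A training loop would run such a test once per new task against each previously learned task, transfer from the tasks that pass it, and fall back to pure forgetting-avoidance when none do.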

Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Continual Learning | 20Newsgroup (10 tasks) | CAT | F1 - macro | 0.9516 | #3 |
| Continual Learning | ASC (19 tasks) | CAT | F1 - macro | 0.6864 | #14 |
| Continual Learning | DSC (10 tasks) | CAT | F1 - macro | 0.8651 | #2 |
| Continual Learning | F-CelebA (10 tasks) | CAT (CNN backbone) | Acc | 0.7564 | #1 |
| Continual Learning | F-CelebA (10 tasks) | CAT (MLP backbone) | Acc | 0.6909 | #2 |
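
For reference, the F1 - macro values above are macro-averaged F1 scores: F1 is computed per class and then averaged with equal weight, regardless of class frequency. A minimal example using scikit-learn (not the paper's evaluation code; toy labels for illustration only):

```python
from sklearn.metrics import f1_score

# Toy labels for illustration only.
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
print(f1_score(y_true, y_pred, average="macro"))  # unweighted mean of per-class F1
```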

Methods


No methods listed for this paper.