PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning

This paper presents a method for adding multiple tasks to a single deep neural network while avoiding catastrophic forgetting. Inspired by network pruning techniques, we exploit redundancies in large deep networks to free up parameters that can then be employed to learn new tasks...
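The core idea sketched in the abstract, iteratively pruning low-magnitude weights to free capacity for the next task while freezing the weights already claimed by earlier tasks, can be illustrated with a toy example. The following is a minimal sketch of that mechanism on a flat weight vector, not the authors' implementation; the function names (`prune_and_assign`, `mask_for_task`) and the simulated "retraining" step are illustrative assumptions.

```python
import numpy as np

def prune_and_assign(weights, owner, task_id, prune_frac=0.5):
    # Among weights not yet claimed by an earlier task, keep the
    # largest-magnitude (1 - prune_frac) fraction for `task_id` and
    # zero out the rest, releasing them for future tasks.
    free = owner == 0                      # 0 = unassigned
    free_mag = np.abs(weights[free])
    k = int(prune_frac * free_mag.size)    # number of free weights to prune
    thresh = np.sort(free_mag)[k]          # magnitude cutoff among free weights
    keep = free & (np.abs(weights) >= thresh)
    owner, weights = owner.copy(), weights.copy()
    owner[keep] = task_id                  # these weights now belong to task_id
    weights[free & ~keep] = 0.0            # pruned: available for the next task
    return weights, owner

def mask_for_task(owner, task_id):
    # Inference for task t uses only weights claimed by tasks 1..t; weights
    # added for later tasks are masked out, so earlier tasks are unaffected.
    return (owner >= 1) & (owner <= task_id)

rng = np.random.default_rng(0)
w = rng.normal(size=20)                    # toy "layer" of 20 weights
owner = np.zeros(20, dtype=int)            # ownership map, 0 = free

# Task 1: train (omitted here), then prune; half the weights stay with task 1.
w, owner = prune_and_assign(w, owner, task_id=1)

# Task 2: retrain only the freed weights (simulated by re-sampling them),
# then prune again, leaving spare capacity for a task 3.
w[owner == 0] = rng.normal(size=int((owner == 0).sum()))
w, owner = prune_and_assign(w, owner, task_id=2)
```

Because weights owned by earlier tasks are never modified or masked away at their own inference time, performance on those tasks is preserved exactly, which is the property the paper targets.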

PDF · Abstract · CVPR 2018
| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Continual Learning | CIFAR-100 (20 tasks) | PackNet | Average Accuracy | 67.5 | #3 |
| Continual Learning | CUBS (Fine-grained 6 Tasks) | PackNet | Accuracy | 80.41 | #3 |
| Continual Learning | Flowers (Fine-grained 6 Tasks) | PackNet | Accuracy | 93.04 | #4 |
| Continual Learning | ImageNet (Fine-grained 6 Tasks) | PackNet | Accuracy | 75.71 | #3 |
| Continual Learning | Sketch (Fine-grained 6 Tasks) | PackNet | Accuracy | 76.17 | #4 |
| Continual Learning | Stanford Cars (Fine-grained 6 Tasks) | PackNet | Accuracy | 86.11 | #4 |
| Continual Learning | Wikiart (Fine-grained 6 Tasks) | PackNet | Accuracy | 69.40 | #4 |
