Piggyback: Adapting a Single Network to Multiple Tasks by Learning to Mask Weights

This work presents a method for adapting a single, fixed deep neural network to multiple tasks without affecting performance on already learned tasks. By building upon ideas from network quantization and pruning, we learn binary masks that piggyback on an existing network, or are applied to unmodified weights of that network to provide good performance on a new task...
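The core mechanism, a learned binary mask multiplied elementwise into frozen backbone weights, can be sketched in a few lines of PyTorch. The sketch below is a minimal illustrative reconstruction, not the authors' released code: the class names (`BinaryMaskSTE`, `MaskedConv2d`), the fixed threshold value, and the score initialization are assumptions of this sketch; the straight-through estimator for gradients through the thresholding step reflects how the paper trains real-valued mask scores.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinaryMaskSTE(torch.autograd.Function):
    """Threshold real-valued scores to a {0, 1} mask; pass gradients straight through."""
    @staticmethod
    def forward(ctx, scores, threshold):
        return (scores >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: the gradient of the binary mask is
        # routed unchanged to the real-valued scores (no gradient for threshold).
        return grad_output, None

class MaskedConv2d(nn.Module):
    """Conv layer whose pretrained weight stays frozen; only the mask scores train."""
    def __init__(self, conv, threshold=5e-3):  # threshold value is an assumption
        super().__init__()
        self.conv = conv
        self.conv.weight.requires_grad_(False)  # backbone weights are never modified
        self.threshold = threshold
        # One real-valued score per weight, initialized above the threshold so the
        # initial mask is all ones, i.e., the unmodified pretrained backbone.
        self.scores = nn.Parameter(torch.full_like(conv.weight, 1e-2))

    def forward(self, x):
        mask = BinaryMaskSTE.apply(self.scores, self.threshold)
        return F.conv2d(x, self.conv.weight * mask, self.conv.bias,
                        self.conv.stride, self.conv.padding,
                        self.conv.dilation, self.conv.groups)

# Usage: wrap a pretrained layer and train only the per-task mask scores.
layer = MaskedConv2d(nn.Conv2d(3, 16, kernel_size=3, padding=1))
out = layer(torch.randn(1, 3, 32, 32))
```

Under this scheme, each new task stores only one bit per backbone weight (the binary mask), while the shared pretrained weights stay intact, which is what preserves performance on previously learned tasks.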

PDF Abstract (ECCV 2018)
| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Continual Learning | CUBS (Fine-grained 6 Tasks) | Piggyback | Accuracy | 84.59 | #1 |
| Continual Learning | Flowers (Fine-grained 6 Tasks) | Piggyback | Accuracy | 94.77 | #2 |
| Continual Learning | ImageNet (Fine-grained 6 Tasks) | Piggyback | Accuracy | 76.16 | #1 |
| Continual Learning | Sketch (Fine-grained 6 Tasks) | Piggyback | Accuracy | 79.91 | #2 |
| Continual Learning | Stanford Cars (Fine-grained 6 Tasks) | Piggyback | Accuracy | 89.62 | #2 |
| Continual Learning | Visual Domain Decathlon (10 tasks) | Piggyback | Decathlon discipline (Score) | 2838 | #8 |
| Continual Learning | Wikiart (Fine-grained 6 Tasks) | Piggyback | Accuracy | 71.33 | #3 |
