Towards Redundancy-Free Sub-networks in Continual Learning

1 Dec 2023 · Cheng Chen, Jingkuan Song, Lianli Gao, Heng Tao Shen

Catastrophic Forgetting (CF) is a prominent issue in continual learning. Parameter isolation addresses this challenge by masking a sub-network for each task to mitigate interference with old tasks. However, these sub-networks are constructed based on weight magnitude, which does not necessarily correspond to the importance of weights; as a result, unimportant weights are retained and the sub-networks become redundant. To overcome this limitation, inspired by the information bottleneck principle, which removes redundancy between adjacent network layers, we propose Information Bottleneck Masked sub-network (IBM) to eliminate redundancy within sub-networks. Specifically, IBM accumulates valuable information into essential weights to construct redundancy-free sub-networks, which not only effectively mitigates CF by freezing the sub-networks but also facilitates the training of new tasks through the transfer of valuable knowledge. Additionally, IBM decomposes hidden representations to automate the construction process and make it flexible. Extensive experiments demonstrate that IBM consistently outperforms state-of-the-art methods. Notably, IBM surpasses the state-of-the-art parameter isolation method with a 70% reduction in the number of parameters within sub-networks and an 80% decrease in training time.
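As a rough illustration of the parameter-isolation idea the abstract builds on (not the authors' implementation), the sketch below assigns each task a binary mask over a linear layer's weights and freezes the weights claimed by earlier tasks; the `MaskedLinear` class, the `keep_ratio` parameter, and the per-weight `importance` score are all hypothetical stand-ins, and the score would come from an information-bottleneck criterion in IBM rather than the placeholder used here.

```python
import torch
import torch.nn as nn

class MaskedLinear(nn.Module):
    """Linear layer with one binary weight mask per task (illustrative sketch)."""
    def __init__(self, in_features, out_features, keep_ratio=0.3):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.keep_ratio = keep_ratio      # fraction of weights a task may claim (assumption)
        self.task_masks = {}              # task_id -> frozen binary mask
        self.frozen = torch.zeros_like(self.weight, dtype=torch.bool)

    def commit_task(self, task_id, importance):
        # `importance` scores each weight; IBM derives such a score from an
        # information-bottleneck criterion rather than weight magnitude.
        k = int(self.keep_ratio * self.weight.numel())
        scores = importance.masked_fill(self.frozen, float("-inf")).flatten()
        idx = scores.topk(k).indices
        mask = torch.zeros_like(scores, dtype=torch.bool)
        mask[idx] = True
        mask = mask.view_as(self.weight)
        self.task_masks[task_id] = mask
        self.frozen |= mask               # weights of old tasks are never updated again

    def forward(self, x, task_id):
        # A committed task uses its frozen sub-network; a new task sees only free weights.
        mask = self.task_masks.get(task_id, ~self.frozen)
        return nn.functional.linear(x, self.weight * mask)

# Usage: train on task 0, then commit its sub-network and reuse it at inference.
layer = MaskedLinear(64, 32)
x = torch.randn(8, 64)
out = layer(x, task_id=0)                              # before commit: all non-frozen weights
layer.commit_task(0, importance=torch.rand_like(layer.weight))  # random placeholder score
out = layer(x, task_id=0)                              # after commit: task 0's frozen mask
```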

Task                 Dataset (Benchmark)                   Model   Metric     Value   Global Rank
Continual Learning   CIFAR-100 AlexNet - 300 Epochs        IBM     Accuracy   82.69   #1
Continual Learning   CIFAR-100 ResNet-18 - 300 Epochs      IBM     Accuracy   88.15   #1
Continual Learning   MiniImageNet ResNet-18 - 300 Epochs   IBM     Accuracy   53.9    #1
Continual Learning   TinyImageNet ResNet-18 - 300 Epochs   IBM     Accuracy   52.38   #1
