Highway-Connection Classifier Networks for Plastic yet Stable Continual Learning

Catastrophic forgetting occurs when a neural network is trained sequentially on multiple tasks: as its weights are continually modified, the network loses its ability to solve earlier tasks. Many studies have proposed techniques to prevent a base learner from forgetting. This paper provides a new perspective and introduces Highway-Connection Classifier Networks (HCNs) to complement existing continual learning techniques. Implemented alone, HCNs exhibit strong robustness against forgetting; combined with existing continual learning techniques, they outperform their baselines. Furthermore, our experiments show that HCNs achieve competitive performance when only their shallowest layer is subjected to continual learning techniques, implying that continual learning tuning can be limited to specific layers.
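The abstract does not specify the exact HCN architecture, but it builds on highway connections. As background, a standard highway layer (Srivastava et al., 2015) mixes a learned transform with an identity "carry" path via a sigmoid gate, y = T(x) * H(x) + (1 - T(x)) * x. The sketch below is a minimal NumPy illustration of that gating mechanism, not the paper's implementation; all names and initializations here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class HighwayLayer:
    """Illustrative highway layer: y = T(x) * H(x) + (1 - T(x)) * x.

    T is a sigmoid transform gate and H a plain nonlinear transform;
    input and output share dimension d so the carry path is the identity.
    This is generic background, not the HCN architecture from the paper.
    """

    def __init__(self, d, rng):
        self.W_h = rng.standard_normal((d, d)) * 0.1  # transform weights
        self.b_h = np.zeros(d)
        self.W_t = rng.standard_normal((d, d)) * 0.1  # gate weights
        # Negative gate bias biases the layer toward the carry (identity)
        # path early in training, as recommended for highway networks.
        self.b_t = np.full(d, -1.0)

    def __call__(self, x):
        h = np.tanh(x @ self.W_h + self.b_h)  # candidate transform H(x)
        t = sigmoid(x @ self.W_t + self.b_t)  # transform gate T(x) in (0, 1)
        return t * h + (1.0 - t) * x          # gated mix of transform and carry

rng = np.random.default_rng(0)
layer = HighwayLayer(8, rng)
x = rng.standard_normal((4, 8))
y = layer(x)  # same shape as x: (4, 8)
```

Because the gate interpolates toward the identity, a mostly-closed gate leaves the input nearly unchanged, which is the intuition for why such connections can help preserve earlier behavior while still allowing new learning.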
