Split-CIFAR-10
3 papers with code • 0 benchmarks • 0 datasets
Benchmarks
These leaderboards are used to track progress in Split-CIFAR-10
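Split-CIFAR-10 is commonly constructed by partitioning CIFAR-10's ten classes into five sequential tasks of two classes each, which the model learns one after another. A minimal sketch of that split protocol (an illustration on toy labels, not any specific library's API; `make_split_tasks` is a hypothetical helper):

```python
def make_split_tasks(labels, classes_per_task=2, num_classes=10):
    """Group example indices into sequential tasks by class label."""
    # Partition the class ids: [0,1], [2,3], [4,5], [6,7], [8,9].
    task_classes = [
        list(range(c, c + classes_per_task))
        for c in range(0, num_classes, classes_per_task)
    ]
    tasks = []
    for classes in task_classes:
        # Each task keeps only the examples whose label falls in its class pair.
        indices = [i for i, y in enumerate(labels) if y in classes]
        tasks.append({"classes": classes, "indices": indices})
    return tasks

# Toy labels standing in for CIFAR-10 targets (real labels would come
# from a dataset loader of your choice).
labels = [i % 10 for i in range(100)]
tasks = make_split_tasks(labels)
print(len(tasks))           # → 5
print(tasks[0]["classes"])  # → [0, 1]
```

Continual-learning methods are then trained on the tasks in order and evaluated on how well earlier tasks are retained.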
Most implemented papers
Self-Attention Meta-Learner for Continual Learning
In this paper, we propose a new method, named Self-Attention Meta-Learner (SAM), which learns prior knowledge for continual learning that permits learning a sequence of tasks while avoiding catastrophic forgetting.
Mixture-of-Variational-Experts for Continual Learning
One weakness of machine learning models is their poor ability to solve new problems without forgetting previously acquired knowledge.
Negotiated Representations to Prevent Forgetting in Machine Learning Applications
By evaluating our method on these challenging datasets, we aim to showcase its potential for addressing catastrophic forgetting and improving the performance of neural networks in continual learning settings.