Search Results for author: Gido M. van de Ven

Found 15 papers, 10 papers with code

Continual Learning and Catastrophic Forgetting

no code implementations • 8 Mar 2024 Gido M. van de Ven, Nicholas Soures, Dhireesha Kudithipudi

This book chapter delves into the dynamics of continual learning, which is the process of incrementally learning from a non-stationary stream of data.

Continual Learning

Infinite dSprites for Disentangled Continual Learning: Separating Memory Edits from Generalization

1 code implementation • 27 Dec 2023 Sebastian Dziadzio, Çağatay Yıldız, Gido M. van de Ven, Tomasz Trzciński, Tinne Tuytelaars, Matthias Bethge

In a simple setting with direct supervision on the generative factors, we show how learning class-agnostic transformations offers a way to circumvent catastrophic forgetting and improve classification accuracy over time.

Classification • Continual Learning +3

Continual Learning of Diffusion Models with Generative Distillation

1 code implementation • 23 Nov 2023 Sergi Masip, Pau Rodriguez, Tinne Tuytelaars, Gido M. van de Ven

We demonstrate that our approach significantly improves the continual learning performance of generative replay with only a moderate increase in the computational costs.

Continual Learning • Denoising +1
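
The snippet does not spell out the mechanism, but as a rough sketch, assuming "generative distillation" means supervising the new (student) diffusion model with the previous (teacher) model's noise predictions on replayed samples; all names below are illustrative, not from the paper:

```python
import torch
import torch.nn.functional as F

def generative_distillation_loss(student, teacher, x_replay, alphas_cumprod):
    """Hypothetical sketch: the frozen teacher (the diffusion model from
    previous tasks) provides noise-prediction targets on replayed samples,
    so the student is supervised along the denoising trajectory rather
    than only on the teacher's final samples."""
    b = x_replay.size(0)
    t = torch.randint(0, len(alphas_cumprod), (b,))       # random timesteps
    noise = torch.randn_like(x_replay)
    a = alphas_cumprod[t].view(b, *([1] * (x_replay.dim() - 1)))
    x_t = a.sqrt() * x_replay + (1 - a).sqrt() * noise    # forward diffusion
    with torch.no_grad():
        target = teacher(x_t, t)                          # teacher's prediction
    return F.mse_loss(student(x_t, t), target)            # student matches it
```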

Two Complementary Perspectives to Continual Learning: Ask Not Only What to Optimize, But Also How

no code implementations • 8 Nov 2023 Timm Hess, Tinne Tuytelaars, Gido M. van de Ven

Recent years have seen considerable progress in the continual training of deep neural networks, predominantly thanks to approaches that add replay or regularization terms to the loss function to approximate the joint loss over all tasks so far.

Continual Learning
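
To make the loss-centric view in the snippet concrete, here is a minimal sketch in plain PyTorch (the function and argument names are illustrative, not from the paper) of how a replay term and a quadratic regularization term are added to the current-task loss to approximate the joint loss over all tasks so far:

```python
import torch
import torch.nn.functional as F

def continual_loss(model, new_batch, replay_batch=None,
                   old_params=None, importance=None, lam=1.0):
    """Illustrative only: approximate the joint loss over all tasks so far.
    replay_batch holds stored or generated examples from earlier tasks
    (replay); old_params / importance define a per-parameter quadratic
    penalty in the style of EWC (regularization)."""
    x, y = new_batch
    loss = F.cross_entropy(model(x), y)                    # current-task loss

    if replay_batch is not None:                           # replay term
        xr, yr = replay_batch
        loss = loss + F.cross_entropy(model(xr), yr)

    if old_params is not None and importance is not None:  # regularization term
        penalty = sum((w * (p - p0).pow(2)).sum()
                      for p, p0, w in zip(model.parameters(),
                                          old_params, importance))
        loss = loss + lam * penalty
    return loss
```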

Prediction Error-based Classification for Class-Incremental Learning

1 code implementation • 30 May 2023 Michał Zając, Tinne Tuytelaars, Gido M. van de Ven

Class-incremental learning (CIL) is a particularly challenging variant of continual learning, where the goal is to learn to discriminate between all classes presented in an incremental fashion.

Classification • Class Incremental Learning +1

Knowledge Accumulation in Continually Learned Representations and the Issue of Feature Forgetting

no code implementations • 3 Apr 2023 Timm Hess, Eli Verwimp, Gido M. van de Ven, Tinne Tuytelaars

Carefully taking both aspects into account, we show that, although feature forgetting can be small in absolute terms, newly learned information tends to be forgotten just as catastrophically at the level of the representation as at the output level.

Continual Learning • Image Classification +2

Natural continual learning: success is a journey, not (just) a destination

1 code implementation • NeurIPS 2021 Ta-Chu Kao, Kristopher T. Jensen, Gido M. van de Ven, Alberto Bernacchia, Guillaume Hennequin

In contrast, artificial agents are prone to 'catastrophic forgetting' whereby performance on previous tasks deteriorates rapidly as new ones are acquired.

Continual Learning

Class-Incremental Learning with Generative Classifiers

1 code implementation • 20 Apr 2021 Gido M. van de Ven, Zhe Li, Andreas S. Tolias

As a proof of principle, here we implement this strategy by training a variational autoencoder for each class to be learned and by using importance sampling to estimate the likelihoods p(x|y).

Class Incremental Learning • Incremental Learning
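
The snippet describes the method concretely enough for a sketch. Below, assuming a hypothetical per-class VAE interface (encode() returning the posterior q(z|x) as a torch distribution, decode() returning p(x|z), and a standard-normal prior), classification picks the class whose VAE assigns the input the highest importance-sampling estimate of p(x|y):

```python
import math
import torch

def classify(x, vaes, n_samples=64):
    """Sketch of generative classification with one VAE per class:
    log p(x|y) is estimated by importance sampling with the encoder as
    proposal, log p(x|y) ~= log mean_z [ p(x|z) p(z) / q(z|x) ] with
    z drawn from q(z|x); the class with the highest estimate wins."""
    prior = torch.distributions.Normal(0.0, 1.0)
    log_liks = []
    for vae in vaes:                                  # one VAE per class seen
        q = vae.encode(x)                             # posterior q(z|x)
        z = q.rsample((n_samples,))                   # importance samples
        log_w = (vae.decode(z).log_prob(x).sum(-1)    # log p(x|z)
                 + prior.log_prob(z).sum(-1)          # log p(z)
                 - q.log_prob(z).sum(-1))             # log q(z|x)
        log_liks.append(torch.logsumexp(log_w, 0) - math.log(n_samples))
    return torch.stack(log_liks).argmax()             # predicted class
```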

Energy-Based Models for Continual Learning

1 code implementation • 24 Nov 2020 Shuang Li, Yilun Du, Gido M. van de Ven, Igor Mordatch

We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems.

Continual Learning
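
The snippet is motivational rather than mechanistic, but the basic mechanic of energy-based classification is easy to show; a minimal sketch, assuming a hypothetical energy network that scores an input-label pair:

```python
import torch

def ebm_predict(energy_net, x, classes_seen):
    """Sketch of classification with an energy function E(x, y): the
    prediction is the label with the lowest energy. A natural fit for
    continual learning, since the comparison can be restricted to
    whichever classes have been encountered so far."""
    energies = torch.stack([energy_net(x, y) for y in classes_seen])
    return classes_seen[int(energies.argmin())]
```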

Brain-inspired replay for continual learning with artificial neural networks

1 code implementation • 13 Aug 2020 Gido M. van de Ven, Hava T. Siegelmann, Andreas S. Tolias

In artificial neural networks, such memory replay can be implemented as 'generative replay', which can successfully, and surprisingly efficiently, prevent catastrophic forgetting on toy examples even in a class-incremental learning scenario.

Class Incremental Learning • Incremental Learning
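
As a sketch of the generic generative-replay loop the snippet refers to (the paper's brain-inspired variant merges generator and solver through feedback connections; the interface below is hypothetical and optimizer bookkeeping is omitted):

```python
import torch
import torch.nn.functional as F

def train_on_task(solver, loader, prev_generator=None, prev_solver=None):
    """Schematic generative replay: for every batch of new data, sample
    'replayed' inputs from a generator trained on earlier tasks, label
    them with the previous solver, and train on both streams so the old
    tasks keep shaping the loss. The generator is retrained alongside
    the solver so it can replay the current task later on."""
    for x, y in loader:
        loss = F.cross_entropy(solver(x), y)              # current task
        if prev_generator is not None:
            x_re = prev_generator.sample(x.size(0))       # generated inputs
            with torch.no_grad():
                y_re = prev_solver(x_re).argmax(dim=-1)   # labels from old solver
            loss = loss + F.cross_entropy(solver(x_re), y_re)
        loss.backward()  # in practice: optimizer.step(); optimizer.zero_grad()
```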

Omnidirectional Transfer for Quasilinear Lifelong Learning

1 code implementation • 27 Apr 2020 Joshua T. Vogelstein, Jayanta Dey, Hayden S. Helm, Will LeVine, Ronak D. Mehta, Ali Geisa, Haoyin Xu, Gido M. van de Ven, Emily Chang, Chenyu Gao, Weiwei Yang, Bryan Tower, Jonathan Larson, Christopher M. White, Carey E. Priebe

But striving to avoid forgetting sets the goal unnecessarily low: the goal of lifelong learning, whether biological or artificial, should be to improve performance on all tasks (including past and future) with any new data.

Federated Learning • Transfer Learning

Three scenarios for continual learning

8 code implementations • 15 Apr 2019 Gido M. van de Ven, Andreas S. Tolias

Standard artificial neural networks suffer from the well-known issue of catastrophic forgetting, making continual or lifelong learning difficult for machine learning.

Class Incremental Learning • Incremental Learning +1

Generative replay with feedback connections as a general strategy for continual learning

5 code implementations • 27 Sep 2018 Gido M. van de Ven, Andreas S. Tolias

A major obstacle to developing artificial intelligence applications capable of true lifelong learning is that artificial neural networks quickly and catastrophically forget previously learned tasks when trained on a new one.

Continual Learning • Permuted-MNIST

Three continual learning scenarios and a case for generative replay

no code implementations • 27 Sep 2018 Gido M. van de Ven, Andreas S. Tolias

To enable more meaningful comparisons, we identified three distinct continual learning scenarios based on whether task identity is known and, if it is not, whether it needs to be inferred.

Continual Learning • Permuted-MNIST
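
The three scenarios in this snippet correspond to what the published version of this work calls Task-IL, Domain-IL and Class-IL; schematically, they differ in what the network must decide at test time. A sketch (the model attributes below, head_slice and num_classes_seen, are hypothetical):

```python
def predict(model, x, scenario, task_id=None):
    """Schematic test-time difference between the three scenarios.
    Task-IL: task identity is given, so only that task's output units
    are compared. Domain-IL: task identity is not needed, because the
    output space is shared across tasks. Class-IL: task identity must
    effectively be inferred, since all classes from all tasks compete."""
    logits = model(x)
    if scenario == "task":                            # task identity known
        return logits[model.head_slice(task_id)].argmax()
    if scenario == "domain":                          # identity irrelevant
        return logits.argmax()
    return logits[: model.num_classes_seen].argmax()  # class-IL
```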
