Search Results for author: Axel Laborieux

Found 8 papers, 6 papers with code

Improving equilibrium propagation without weight symmetry through Jacobian homeostasis

1 code implementation · 5 Sep 2023 · Axel Laborieux, Friedemann Zenke

Equilibrium propagation (EP) is a compelling alternative to the backpropagation of error algorithm (BP) for computing gradients of neural networks on biological or analog neuromorphic substrates.
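As a rough illustration of the idea only (not the paper's implementation), EP estimates a loss gradient from two relaxation phases of an energy-based model: a free phase and a weakly nudged phase. The scalar model, variable names, and the value of beta below are all illustrative assumptions.

```python
# Toy energy-based model: input x, one hidden unit s, one weight w.
# Energy E(s) = 0.5*s**2 - w*x*s, cost C(s) = 0.5*(s - y)**2.
# The free fixed point minimizes E; the nudged one minimizes E + beta*C.

def ep_gradient(w, x, y, beta):
    s_free = w * x                               # argmin_s E(s)
    s_nudged = (w * x + beta * y) / (1 + beta)   # argmin_s E(s) + beta*C(s)
    # dF/dw = -x*s evaluated at both fixed points; EP takes their difference:
    return (-x * s_nudged + x * s_free) / beta

w, x, y, beta = 0.7, 1.3, 2.0, 1e-3
true_grad = (w * x - y) * x          # exact gradient of L(w) = C(s_free)
est = ep_gradient(w, x, y, beta)
print(abs(est - true_grad))          # small: the bias vanishes as beta -> 0
```

Only quantities local to the synapse (x and the two values of s) enter the update, which is what makes EP attractive for analog neuromorphic hardware.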

Implicit variance regularization in non-contrastive SSL

1 code implementation · NeurIPS 2023 · Manu Srinath Halvagal, Axel Laborieux, Friedemann Zenke

To gain further theoretical insight into non-contrastive SSL, we analytically study learning dynamics in conjunction with Euclidean and cosine similarity in the eigenspace of closed-form linear predictor networks.

Holomorphic Equilibrium Propagation Computes Exact Gradients Through Finite Size Oscillations

1 code implementation · 1 Sep 2022 · Axel Laborieux, Friedemann Zenke

Equilibrium propagation (EP) is an alternative to backpropagation (BP) that allows the training of deep neural networks with local learning rules.

Model of the Weak Reset Process in HfOx Resistive Memory for Deep Learning Frameworks

no code implementations · 2 Jul 2021 · Atreya Majumdar, Marc Bocquet, Tifenn Hirtzlin, Axel Laborieux, Jacques-Olivier Klein, Etienne Nowak, Elisa Vianello, Jean-Michel Portal, Damien Querlioz

However, the resistive change behavior in this regime suffers from considerable fluctuations and is particularly challenging to model, especially in a way compatible with tools used for simulating deep learning.


Synaptic metaplasticity in binarized neural networks

2 code implementations · 19 Jan 2021 · Axel Laborieux, Maxence Ernoult, Tifenn Hirtzlin, Damien Querlioz

Unlike the brain, artificial neural networks, including state-of-the-art deep neural networks for computer vision, are subject to "catastrophic forgetting": they rapidly forget the previous task when trained on a new one.

Scaling Equilibrium Propagation to Deep ConvNets by Drastically Reducing its Gradient Estimator Bias

no code implementations · 14 Jan 2021 · Axel Laborieux, Maxence Ernoult, Benjamin Scellier, Yoshua Bengio, Julie Grollier, Damien Querlioz

Equilibrium Propagation (EP) is a biologically inspired counterpart of Backpropagation Through Time (BPTT) which, owing to its strong theoretical guarantees and the spatial locality of its learning rule, fosters the design of energy-efficient hardware dedicated to learning.

Scaling Equilibrium Propagation to Deep ConvNets by Drastically Reducing its Gradient Estimator Bias

1 code implementation · 6 Jun 2020 · Axel Laborieux, Maxence Ernoult, Benjamin Scellier, Yoshua Bengio, Julie Grollier, Damien Querlioz

In this work, we show that a bias in the gradient estimate of EP, inherent in the use of finite nudging, is responsible for this phenomenon and that cancelling it allows training deep ConvNets by EP.
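The effect of finite nudging can be illustrated on a toy scalar model (an assumption of mine, not the paper's ConvNet code): a one-sided nudge gives a gradient estimate with O(beta) bias, while a symmetric two-sided nudge cancels the leading term, leaving only O(beta^2) bias.

```python
# Toy model: energy 0.5*s**2 - w*x*s, cost 0.5*(s - y)**2.
# The nudged fixed point minimizes energy + beta*cost.

def fixed_point(w, x, y, beta):
    return (w * x + beta * y) / (1 + beta)

def one_sided(w, x, y, beta):
    s0, sb = fixed_point(w, x, y, 0.0), fixed_point(w, x, y, beta)
    return x * (s0 - sb) / beta        # uses dF/dw = -x*s at each fixed point

def symmetric(w, x, y, beta):
    sm, sp = fixed_point(w, x, y, -beta), fixed_point(w, x, y, beta)
    return x * (sm - sp) / (2 * beta)  # two-sided (symmetric) nudging

w, x, y, beta = 0.7, 1.3, 2.0, 0.1
true_grad = (w * x - y) * x
print(abs(one_sided(w, x, y, beta) - true_grad))   # bias ~ O(beta)
print(abs(symmetric(w, x, y, beta) - true_grad))   # bias ~ O(beta**2), smaller
```

In this toy model the one-sided estimate equals the true gradient divided by (1 + beta), while the symmetric estimate equals it divided by (1 - beta**2), so the leading bias term indeed cancels.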

Synaptic Metaplasticity in Binarized Neural Networks

1 code implementation · 7 Mar 2020 · Axel Laborieux, Maxence Ernoult, Tifenn Hirtzlin, Damien Querlioz

In this work, we interpret the hidden weights used by binarized neural networks, a low-precision version of deep neural networks, as metaplastic variables, and modify their training technique to alleviate forgetting.
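A hedged sketch of the idea (the function, parameter names, and constants are my illustrative assumptions, not the paper's code): a binarized synapse uses sign(w_hidden) in the forward pass, and updates that would move the hidden weight toward a sign flip are attenuated the larger |w_hidden| already is, so well-consolidated synapses resist being overwritten by a new task.

```python
import math

def metaplastic_update(w_hidden, grad, lr=0.1, m=1.0):
    # Plain SGD step on the hidden (real-valued) weight...
    delta = -lr * grad
    # ...but if the step shrinks |w_hidden| (pushes it toward a sign flip),
    # attenuate it by a factor that decays with how consolidated the synapse is.
    if delta * w_hidden < 0:
        delta *= 1.0 - math.tanh(m * w_hidden) ** 2
    return w_hidden + delta

print(metaplastic_update(2.0, grad=1.0))  # consolidated: barely moves toward 0
print(metaplastic_update(0.1, grad=1.0))  # weakly consolidated: moves freely
```

Updates that grow |w_hidden| are left untouched, so new memories can still be consolidated while old ones are protected.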
