The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks

14 Jun 2023 · Aaron Spieler, Nasim Rahaman, Georg Martius, Bernhard Schölkopf, Anna Levina

Biological cortical neurons are remarkably sophisticated computational devices, temporally integrating their vast synaptic input over an intricate dendritic tree, subject to complex, nonlinearly interacting internal biological processes. A recent study proposed to characterize this complexity by fitting accurate surrogate models to replicate the input-output relationship of a detailed biophysical cortical pyramidal neuron model, and found that temporal convolutional networks (TCNs) with millions of parameters were required. Requiring this many parameters, however, could stem from a misalignment between the inductive biases of the TCN and the cortical neuron's computations. In light of this, and to explore the computational implications of leaky memory units and nonlinear dendritic processing, we introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired phenomenological model of a cortical neuron. Remarkably, by exploiting such slowly decaying memory-like hidden states and a two-layered nonlinear integration of synaptic input, our ELM neuron can accurately match the aforementioned input-output relationship with fewer than ten thousand trainable parameters. To further assess the computational ramifications of our neuron design, we evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets, as well as a novel neuromorphic dataset based on the Spiking Heidelberg Digits dataset (SHD-Adding). Leveraging a larger number of memory units with sufficiently long timescales, and correspondingly sophisticated synaptic integration, the ELM neuron displays substantial long-range processing capabilities, reliably outperforming the classic Transformer and Chrono-LSTM architectures on LRA, and even solving the Pathfinder-X task with over 70% accuracy (16k context length).
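This page does not reproduce the model equations, but the abstract pins down the two key ingredients: a vector of leaky, memory-like hidden states with (potentially long) decay timescales, and a small two-layer nonlinear network that integrates synaptic input together with the current memory. The following is a minimal, hypothetical sketch of how such a cell might look; the class name, the log-spaced timescale initialization, and all hyperparameters (`n_memory`, `tau_min`, `tau_max`, `dt`) are illustrative assumptions, not the authors' implementation.

```python
import math

import torch
import torch.nn as nn


class ELMCell(nn.Module):
    """Minimal sketch of an Expressive Leaky Memory (ELM) neuron cell."""

    def __init__(self, n_synapses, n_memory, n_hidden, dt=1.0,
                 tau_min=1.0, tau_max=1000.0):
        super().__init__()
        self.dt = dt
        # Learnable per-unit timescales, log-spaced so some memory units
        # decay quickly while others retain information over long spans.
        self.log_tau = nn.Parameter(
            torch.linspace(math.log(tau_min), math.log(tau_max), n_memory))
        # Two-layer MLP: nonlinear "dendritic" integration of the current
        # synaptic input together with the previous memory state.
        self.mlp = nn.Sequential(
            nn.Linear(n_synapses + n_memory, n_hidden),
            nn.Tanh(),
            nn.Linear(n_hidden, n_memory),
        )

    def forward(self, x):
        """x: (batch, time, n_synapses) -> (batch, time, n_memory)."""
        batch, steps, _ = x.shape
        m = x.new_zeros(batch, self.log_tau.numel())
        decay = torch.exp(-self.dt / torch.exp(self.log_tau))
        states = []
        for t in range(steps):
            proposal = self.mlp(torch.cat([x[:, t], m], dim=-1))
            # Leaky update: old memory decays toward the MLP's proposal.
            m = decay * m + (1.0 - decay) * proposal
            states.append(m)
        return torch.stack(states, dim=1)
```

A task-specific readout (e.g., a linear layer on the final memory state) would sit on top. The log-spaced timescales are one simple way to realize the mixture of fast and slow memory units that the abstract credits for the model's long-range capability on tasks like Pathfinder-X.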


Datasets

Introduced in the Paper: SHD-Adding

Used in the Paper: LRA, SHD, neuronIO
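The page gives no construction details for SHD-Adding beyond the abstract's description (a neuromorphic task built from the Spiking Heidelberg Digits). One plausible construction, sketched below, concatenates two spoken-digit spike trains in time and labels the pair with the sum of the digits; the function name, the event-array layout, and the `gap` parameter are all hypothetical.

```python
import numpy as np


def make_adding_example(times_a, units_a, label_a,
                        times_b, units_b, label_b, gap=0.1):
    """Hypothetical SHD-Adding sample: two SHD digits played back to back.

    times_*: spike times (seconds) of one spoken digit; units_*: the
    corresponding input-channel indices; the target is the digit sum.
    """
    offset = times_a.max() + gap  # start digit B after digit A plus a pause
    times = np.concatenate([times_a, times_b + offset])
    units = np.concatenate([units_a, units_b])
    return times, units, label_a + label_b
```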

Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Long-range modeling | LRA | Chrono-LSTM | ListOps | 44.55 | # 15 |
| Long-range modeling | LRA | Chrono-LSTM | Text | 75.4 | # 17 |
| Long-range modeling | LRA | Chrono-LSTM | Retrieval | 82.87 | # 16 |
| Long-range modeling | LRA | Chrono-LSTM | Image | 46.09 | # 17 |
| Long-range modeling | LRA | Chrono-LSTM | Pathfinder | 70.79 | # 25 |
| Long-range modeling | LRA | Chrono-LSTM | Avg | 63.94 | # 16 |
| Long-range modeling | LRA | ELM Neuron | ListOps | 46.77 | # 14 |
| Long-range modeling | LRA | ELM Neuron | Text | 80.3 | # 14 |
| Long-range modeling | LRA | ELM Neuron | Retrieval | 84.93 | # 14 |
| Long-range modeling | LRA | ELM Neuron | Image | 49.62 | # 15 |
| Long-range modeling | LRA | ELM Neuron | Pathfinder | 71.15 | # 24 |
| Long-range modeling | LRA | ELM Neuron | Avg | 68.34 | # 14 |
| Long-range modeling | LRA | ELM Neuron | Pathfinder-X | 77.29 | # 13 |
| Time Series | neuronIO | ELM Neuron | Parameters (K) @ 0.991 AUC | 52.92 | # 2 |
| Time Series | neuronIO | B-ELM Neuron | Parameters (K) @ 0.991 AUC | 8.1 | # 1 |
| Time Series | neuronIO | LSTM | Parameters (K) @ 0.991 AUC | 265.9 | # 3 |
| Classification | SHD-Adding | ELM Neuron | Accuracy (%) | 82 | # 1 |
| Classification | SHD-Adding | LIF-SNN | Accuracy (%) | FAIL | # 3 |
| Classification | SHD-Adding | LSTM | Accuracy (%) | 10 | # 2 |
