Contrastive-Signal-Dependent Plasticity: Forward-Forward Learning of Spiking Neural Systems

30 Mar 2023 · Alexander Ororbia

We develop a neuro-mimetic architecture, composed of spiking neuronal units, where individual layers of neurons operate in parallel and adapt their synaptic efficacies without the use of feedback pathways. Specifically, we propose an event-based generalization of forward-forward learning, which we call contrastive-signal-dependent plasticity (CSDP), for a spiking neural system that iteratively processes sensory input over a stimulus window. The dynamics that underwrite this recurrent circuit entail computing the membrane potential of each processing element, in each layer, as a function of local bottom-up, top-down, and lateral signals, facilitating a dynamic, layer-wise parallel form of neural computation. Unlike other models, such as spiking predictive coding, which rely on feedback synapses to adjust neural electrical activity, our model operates purely online and forward in time, offering a promising way to learn distributed representations of sensory data patterns, with and without labeled context information. Notably, our experimental results on several pattern datasets demonstrate that the CSDP process works well for training a dynamic recurrent spiking network capable of both classification and reconstruction.
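
The abstract's description of the circuit and its local learning rule can be made concrete with a small sketch. The Python code below is a minimal illustration, not the paper's implementation: it assumes leaky integrate-and-fire units whose membrane potentials integrate bottom-up, top-down, and lateral signals, and a forward-forward-style contrastive update driven by a spike-trace "goodness" measure. The class name CSDPSpikingLayer, the goodness definition, and all hyperparameters are assumptions made for illustration.

```python
import numpy as np


class CSDPSpikingLayer:
    """One recurrently driven spiking layer with a layer-local contrastive update.

    Hypothetical sketch: the trace-based goodness and all hyperparameters below
    are illustrative choices, not the paper's exact formulation.
    """

    def __init__(self, n_in, n_hid, n_top=0, seed=0, dt=1.0, tau_m=20.0,
                 tau_tr=30.0, v_thr=1.0, lr=1e-3, theta=0.5):
        rng = np.random.default_rng(seed)
        self.W_bu = rng.normal(0.0, 0.1, (n_hid, n_in))                      # bottom-up synapses
        self.W_td = rng.normal(0.0, 0.1, (n_hid, n_top)) if n_top else None  # top-down synapses
        self.W_lat = np.zeros((n_hid, n_hid))                                # lateral (competitive) synapses
        self.v = np.zeros(n_hid)    # membrane potentials
        self.tr = np.zeros(n_hid)   # spike traces (filtered spike history)
        self.dt, self.tau_m, self.tau_tr = dt, tau_m, tau_tr
        self.v_thr, self.lr, self.theta = v_thr, lr, theta

    def step(self, s_in, s_top=None):
        """One simulation step: integrate local signals, emit spikes, update traces."""
        drive = self.W_bu @ s_in
        if self.W_td is not None and s_top is not None:
            drive += self.W_td @ s_top                       # top-down context/label signal
        drive -= self.W_lat @ self.tr                        # lateral competition
        self.v += (self.dt / self.tau_m) * (-self.v + drive)
        spikes = (self.v >= self.v_thr).astype(float)
        self.v = np.where(spikes > 0, 0.0, self.v)           # reset neurons that fired
        self.tr += -(self.dt / self.tau_tr) * self.tr + spikes
        return spikes

    def goodness(self):
        # Layer-local "goodness": mean squared spike trace (an assumed proxy).
        return float(np.mean(self.tr ** 2))

    def local_update(self, s_in, sign):
        """Contrastive update: push goodness above theta for positive data
        (sign=+1) and below it for negative data (sign=-1), using only
        quantities available to this layer."""
        p = 1.0 / (1.0 + np.exp(-sign * (self.goodness() - self.theta)))
        mod = sign * (1.0 - p)                                # scalar modulatory factor
        self.W_bu += self.lr * mod * np.outer(self.tr, s_in)  # gated Hebbian-style change
```

Driving such a layer with a spike-encoded input over a stimulus window, then calling local_update with sign=+1 for real (positive) samples and sign=-1 for contrastive (negative) samples, gives a feedback-free, online training loop in the spirit of the abstract; stacking several layers, each adapted independently from its own local signals, corresponds to the layer-wise parallel computation described above.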
