Search Results for author: Jiangrong Shen

Found 6 papers, 0 papers with code

Enhancing Adaptive History Reserving by Spiking Convolutional Block Attention Module in Recurrent Neural Networks

no code implementations · NeurIPS 2023 · Qi Xu, Yuyuan Gao, Jiangrong Shen, Yaxin Li, Xuming Ran, Huajin Tang, Gang Pan

Spiking neural networks (SNNs) are an efficient class of models for processing spatio-temporal patterns in time series, such as the Address-Event Representation (AER) data collected from a Dynamic Vision Sensor (DVS).

Time Series
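As a rough illustration of the AER data mentioned above: a DVS does not produce frames; each pixel asynchronously emits an event when local brightness changes. The event fields and the binning helper below are hypothetical names chosen for this sketch, not the paper's code.

```python
from collections import namedtuple

# Hypothetical minimal representation of an AER event stream:
# each DVS pixel emits (timestamp, x, y, polarity) when brightness changes.
Event = namedtuple("Event", ["t_us", "x", "y", "polarity"])

events = [
    Event(t_us=1000, x=12, y=34, polarity=1),   # brightness increase
    Event(t_us=1042, x=12, y=35, polarity=-1),  # brightness decrease
]

def bin_events(events, window_us=500):
    """Group events into fixed time windows, the usual first step when
    turning an asynchronous event stream into spatio-temporal SNN input."""
    bins = {}
    for ev in events:
        bins.setdefault(ev.t_us // window_us, []).append(ev)
    return bins

binned = bin_events(events)
```

Each time bin can then be rasterized into a sparse spike tensor per polarity channel before being fed to the network.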

Neuromorphic Auditory Perception by Neural Spiketrum

no code implementations · 11 Sep 2023 · Huajin Tang, Pengjie Gu, Jayawan Wijekoon, MHD Anas Alsakkal, ZiMing Wang, Jiangrong Shen, Rui Yan

Neuromorphic computing promises to achieve the energy efficiency and robust learning performance of biological neural systems.

ESL-SNNs: An Evolutionary Structure Learning Strategy for Spiking Neural Networks

no code implementations · 6 Jun 2023 · Jiangrong Shen, Qi Xu, Jian K. Liu, Yueming Wang, Gang Pan, Huajin Tang

To take full advantage of their low power consumption and further improve efficiency, pruning methods have been explored to find sparse SNNs without redundant connections after training.
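The pruning idea mentioned in the snippet can be sketched in its simplest form, magnitude-based pruning: drop the weakest fraction of connections by absolute weight. This is a generic illustration under assumed parameters, not the evolutionary structure-learning strategy the paper itself proposes.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    """Zero out the `sparsity` fraction of smallest-magnitude connections.
    Hypothetical helper for illustration; returns pruned weights and a mask."""
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))          # a toy 4x4 synaptic weight matrix
pruned, mask = magnitude_prune(w, sparsity=0.5)
```

In an SNN the surviving mask defines the sparse connectivity; evolutionary approaches like ESL-SNNs instead grow and prune connections during training rather than once afterwards.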

Biologically inspired structure learning with reverse knowledge distillation for spiking neural networks

no code implementations · 19 Apr 2023 · Qi Xu, Yaxin Li, Xuanye Fang, Jiangrong Shen, Jian K. Liu, Huajin Tang, Gang Pan

The proposed method explores a novel dynamic way of learning structure from scratch in SNNs, building a bridge between deep learning and bio-inspired neural dynamics.

Knowledge Distillation

LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks

no code implementations · 17 Apr 2023 · Di Hong, Jiangrong Shen, Yu Qi, Yueming Wang

A conversion scheme is proposed to obtain competitive accuracy by mapping trained ANNs' parameters to SNNs with the same structures.

Knowledge Distillation
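The conversion idea in the snippet above rests on a standard observation: the firing rate of an integrate-and-fire (IF) neuron driven by a constant input approximates a ReLU activation, so trained ANN weights can be reused in an SNN of the same structure. The sketch below shows only that rate/ReLU correspondence under assumed parameters; real conversion pipelines additionally rescale weights and thresholds.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def if_neuron_rate(x, timesteps=100, v_th=1.0):
    """Integrate-and-fire neuron driven by constant input x over T timesteps.
    Returns the firing rate, which approximates relu(x) for 0 <= x <= v_th."""
    v = np.zeros_like(x, dtype=float)
    spikes = np.zeros_like(x, dtype=float)
    for _ in range(timesteps):
        v += x                                 # integrate constant input current
        fired = v >= v_th
        spikes += fired                        # count emitted spikes
        v = np.where(fired, v - v_th, v)       # "soft reset" by subtraction
    return spikes / timesteps

x = np.array([-0.5, 0.2, 0.7, 1.0])
rate = if_neuron_rate(x)                       # close to relu(x) elementwise
```

Because the rate saturates at 1 spike per timestep, inputs above the threshold are clipped, which is one source of the accuracy gap that layer-wise distillation methods such as LaSNN aim to close.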

Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation

no code implementations · CVPR 2023 · Qi Xu, Yaxin Li, Jiangrong Shen, Jian K. Liu, Huajin Tang, Gang Pan

Spiking neural networks (SNNs) are well known as brain-inspired models with high computational efficiency, owing to a key feature: they use spikes as information units, much like biological neural systems.

Knowledge Distillation
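The "spikes as information units" idea above is usually modeled with a leaky integrate-and-fire (LIF) neuron: the membrane potential integrates input, leaks over time, and emits a binary spike on crossing a threshold. The parameters below (tau, v_th) are illustrative choices, not values from the paper.

```python
import numpy as np

def lif_spike_train(current, tau=10.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """Leaky integrate-and-fire neuron: integrates input with leak, emits a
    binary spike and resets when the membrane potential crosses threshold."""
    v = 0.0
    spikes = []
    for i_t in current:
        v += dt * (-v / tau + i_t)   # leaky integration of the input current
        if v >= v_th:
            spikes.append(1)         # spike: the discrete information unit
            v = v_reset
        else:
            spikes.append(0)
    return spikes

inp = [0.3] * 20                     # constant input current for 20 timesteps
spikes = lif_spike_train(inp)
```

Because activity is communicated as sparse binary events rather than dense real-valued activations, computation and memory traffic occur only when spikes fire, which is the source of the efficiency claim.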
