Search Results for author: Di Hong

Found 1 paper, 0 papers with code

LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks

no code implementations17 Apr 2023 Di Hong, Jiangrong Shen, Yu Qi, Yueming Wang

A conversion scheme is proposed that achieves competitive accuracy by mapping the parameters of trained ANNs to SNNs with identical architectures.

Knowledge Distillation
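The ANN-to-SNN conversion idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's LaSNN implementation: it assumes rate coding and integrate-and-fire neurons with reset-by-subtraction, and reuses a trained ANN's weight matrices unchanged in an SNN of the same layer structure. All names and the toy weights are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trained ANN: two fully connected layers (4 -> 8 -> 2).
# In a real conversion these would come from a trained ReLU network.
ann_weights = [rng.standard_normal((4, 8)), rng.standard_normal((8, 2))]


def ann_forward(weights, x):
    """Reference ANN forward pass with ReLU activations."""
    for W in weights:
        x = np.maximum(x @ W, 0.0)
    return x


def snn_forward(weights, x, timesteps=100, threshold=1.0):
    """Run an SNN that reuses the ANN weights layer by layer.

    Each layer integrates weighted input into a membrane potential;
    a neuron spikes when its potential crosses `threshold` and is
    then reset by subtraction. The spike rate (spikes / timesteps)
    approximates the corresponding ANN ReLU activation.
    """
    rates = x  # input encoded as constant firing rates
    for W in weights:
        potential = np.zeros(W.shape[1])
        spikes = np.zeros(W.shape[1])
        for _ in range(timesteps):
            potential += rates @ W          # integrate weighted input
            fired = potential >= threshold  # spike condition
            spikes += fired
            potential[fired] -= threshold   # reset by subtraction
        rates = spikes / timesteps
    return rates


x = np.full(4, 0.1)
snn_out = snn_forward(ann_weights, x)
```

The same parameters drive both networks; accuracy of the approximation improves with more timesteps, which is why conversion schemes typically also rebalance thresholds per layer.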
