Search Results for author: Ryuji Saiin

Found 6 papers, 1 paper with code

Magic for the Age of Quantized DNNs

no code implementations · 22 Mar 2024 · Yoshihide Sawada, Ryuji Saiin, Kazuma Suetake

Recently, the number of parameters in DNNs has explosively increased, as exemplified by LLMs (Large Language Models), making inference on small-scale computers more difficult.

Model Compression · Quantization
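For context on the Quantization tag, a minimal sketch of plain uniform weight quantization in NumPy. This is a generic illustration, not the compression scheme this paper proposes; the function name and bit width are assumptions.

```python
import numpy as np

def quantize_uniform(w, num_bits=8):
    """Uniformly quantize a float tensor to signed num_bits integer codes."""
    qmax = 2 ** (num_bits - 1) - 1                        # 127 for 8 bits
    scale = max(float(np.max(np.abs(w))) / qmax, 1e-12)   # avoid division by zero
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

w = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_uniform(w)
w_hat = q.astype(np.float32) * scale                      # dequantize
print("max abs rounding error:", np.abs(w - w_hat).max())
```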

Spike Accumulation Forwarding for Effective Training of Spiking Neural Networks

no code implementations · 4 Oct 2023 · Ryuji Saiin, Tomoya Shirakawa, Sota Yoshihara, Yoshihide Sawada, Hiroyuki Kusumoto

The proposed method, Spike Accumulation Forwarding (SAF), solves these problems: SAF halves the number of operations during the forward process, and it can be proven theoretically that SAF is consistent with both the Spike Representation and OTTT.
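A toy sketch of why accumulating binary spikes before a linear layer gives the same result as forwarding them step by step (the linearity argument only; the actual SAF construction and its consistency proofs with the Spike Representation and OTTT are in the paper, and all shapes here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_in, n_out = 8, 16, 4
spikes = rng.integers(0, 2, size=(T, n_in)).astype(np.float32)  # binary spike train
W = rng.standard_normal((n_in, n_out)).astype(np.float32)

# Step-by-step forwarding: one matrix multiply per time step (T in total).
out_per_step = sum(s @ W for s in spikes)

# Accumulate-then-forward: sum the binary spikes first, then multiply once.
out_accumulated = spikes.sum(axis=0) @ W

assert np.allclose(out_per_step, out_accumulated, atol=1e-5)
```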

Rethinking the role of normalization and residual blocks for spiking neural networks

no code implementations · 3 Mar 2022 · Shin-ichi Ikegawa, Ryuji Saiin, Yoshihide Sawada, Naotake Natori

Biologically inspired spiking neural networks (SNNs) are widely used to achieve ultra-low power consumption.
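For readers unfamiliar with SNNs, a minimal leaky integrate-and-fire (LIF) neuron, the standard textbook spiking unit (a generic model for illustration, not code from this paper; threshold and leak values are arbitrary):

```python
import numpy as np

def lif_forward(inputs, v_th=1.0, leak=0.9):
    """Simulate a single leaky integrate-and-fire neuron over T time steps."""
    v, spikes = 0.0, []
    for x in inputs:
        v = leak * v + x                 # leaky integration of input current
        s = 1.0 if v >= v_th else 0.0    # emit a binary spike at threshold
        v *= (1.0 - s)                   # hard reset after a spike
        spikes.append(s)
    return np.array(spikes)

print(lif_forward(np.full(10, 0.4)))     # sparse 0/1 output train
```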

S$^3$NN: Time Step Reduction of Spiking Surrogate Gradients for Training Energy Efficient Single-Step Spiking Neural Networks

no code implementations · 26 Jan 2022 · Kazuma Suetake, Shin-ichi Ikegawa, Ryuji Saiin, Yoshihide Sawada

To solve these problems, we propose a single-step spiking neural network (S$^3$NN), an energy-efficient neural network with low computational cost and high precision.

Efficient Neural Network · Time Series · +1
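A sketch of the surrogate-gradient trick at a single time step, the general setting S$^3$NN operates in: the forward pass thresholds a linear layer into binary spikes, and the backward pass swaps the Heaviside step's zero-almost-everywhere derivative for a smooth surrogate. The surrogate shape and every parameter below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def heaviside(z):
    return (z >= 0.0).astype(np.float32)      # binary spike, single step

def surrogate_grad(z, alpha=4.0):
    """Sigmoid-shaped surrogate for the Heaviside step's derivative."""
    s = 1.0 / (1.0 + np.exp(-alpha * z))
    return alpha * s * (1.0 - s)

rng = np.random.default_rng(0)
x = rng.standard_normal(5).astype(np.float32)
W = rng.standard_normal((3, 5)).astype(np.float32)

z = W @ x
out = heaviside(z)                            # forward: thresholded linear layer

grad_out = np.ones(3, dtype=np.float32)       # stand-in upstream gradient
grad_z = grad_out * surrogate_grad(z)         # backward: surrogate replaces dH/dz
grad_W = np.outer(grad_z, x)                  # gradient w.r.t. the weights
```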

Spectral Pruning for Recurrent Neural Networks

1 code implementation · 23 May 2021 · Takashi Furuya, Kazuma Suetake, Koichi Taniguchi, Hiroyuki Kusumoto, Ryuji Saiin, Tomohiro Daimon

Recurrent neural networks (RNNs) are a class of neural networks used in sequential tasks.

Edge-computing
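A simplified stand-in for a spectral pruning criterion: score hidden units by their energy in the dominant eigenspace of the hidden-state covariance and keep the highest-scoring ones. The function name and scoring rule are illustrative assumptions; the paper's actual criterion and its theoretical guarantees differ in detail.

```python
import numpy as np

def spectral_unit_scores(hidden_states, k):
    """Score hidden units by their energy in the top-k covariance eigenspace."""
    H = hidden_states - hidden_states.mean(axis=0)
    cov = H.T @ H / len(H)
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    top = eigvecs[:, -k:]                     # dominant spectral directions
    return (top ** 2) @ eigvals[-k:]          # per-unit spectral energy

rng = np.random.default_rng(0)
H = rng.standard_normal((1000, 64)).astype(np.float32)  # collected RNN states
scores = spectral_unit_scores(H, k=16)
keep = np.argsort(scores)[-32:]               # retain the highest-scoring half
print("pruned hidden size:", keep.size)
```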
