Search Results for author: Hilla Ben Yaacov

Found 3 papers, 0 papers with code

Logarithmic Unbiased Quantization: Simple 4-bit Training in Deep Learning

no code implementations • 19 Dec 2021 • Brian Chmiel, Ron Banner, Elad Hoffer, Hilla Ben Yaacov, Daniel Soudry

Based on this, we suggest a logarithmic unbiased quantization (LUQ) method to quantize both the forward and backward phases to 4-bit, achieving state-of-the-art results in 4-bit training without overhead.

Quantization
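The abstract above only names the idea, so here is a minimal, hedged sketch of what unbiased logarithmic (power-of-two) quantization with stochastic rounding could look like in PyTorch. The function name, the 4-bit level layout, and the underflow handling are illustrative assumptions and are not taken from the paper's actual implementation.

```python
import torch

def logarithmic_unbiased_quantize(x, num_bits=4):
    """Illustrative sketch (not the paper's code) of unbiased power-of-two quantization.

    Each value is rounded to one of a small set of power-of-two levels.
    The choice between the two neighbouring levels is stochastic, with the
    round-up probability proportional to the distance from the lower level,
    so that E[q(x)] == x (the quantizer is unbiased).
    """
    sign = torch.sign(x)
    mag = x.abs()

    # Assumed level layout: one sign bit, remaining bits index power-of-two
    # exponents anchored at the tensor's maximum magnitude.
    max_exp = torch.ceil(torch.log2(mag.max().clamp_min(1e-12)))
    num_levels = 2 ** (num_bits - 1)
    min_exp = max_exp - (num_levels - 1)
    min_level = 2.0 ** min_exp

    # Stochastic underflow: values below the smallest level are promoted to
    # min_level with probability mag / min_level, otherwise pruned to zero,
    # which keeps the expectation unchanged.
    below = mag < min_level
    promote = torch.rand_like(mag) < (mag / min_level)
    small = torch.where(promote, min_level * torch.ones_like(mag), torch.zeros_like(mag))

    # Remaining values: round stochastically between the two adjacent
    # power-of-two levels so the expected quantized value equals the input.
    lo_exp = torch.floor(torch.log2(mag.clamp_min(min_level)))
    lo = 2.0 ** lo_exp
    hi = 2.0 ** (lo_exp + 1)
    p_up = (mag - lo) / (hi - lo)
    large = torch.where(torch.rand_like(mag) < p_up, hi, lo)

    return torch.where(below, small, large) * sign
```

The stochastic choice between neighbouring power-of-two levels is what removes the bias that plain round-to-nearest logarithmic quantization would introduce into the gradients.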

Beyond Quantization: Power aware neural networks

no code implementations • 29 Sep 2021 • Nurit Spingarn, Elad Hoffer, Ron Banner, Hilla Ben Yaacov, Tomer Michaeli

Power consumption is a major obstacle in the deployment of deep neural networks (DNNs) on end devices.

Quantization

Logarithmic Unbiased Quantization: Practical 4-bit Training in Deep Learning

no code implementations • 29 Sep 2021 • Brian Chmiel, Ron Banner, Elad Hoffer, Hilla Ben Yaacov, Daniel Soudry

Based on this, we suggest a logarithmic unbiased quantization (LUQ) method to quantize both the forward and backward phases to 4-bit, achieving state-of-the-art results in 4-bit training.

Quantization