Search Results for author: Daniel Becking

Found 4 papers, 1 paper with code

Adaptive Differential Filters for Fast and Communication-Efficient Federated Learning

no code implementations · 9 Apr 2022 · Daniel Becking, Heiner Kirchhoffer, Gerhard Tech, Paul Haase, Karsten Müller, Heiko Schwarz, Wojciech Samek

Federated learning (FL) scenarios inherently generate large communication overhead by frequently transmitting neural network updates between clients and the server.

Federated Learning

ECQ$^{\text{x}}$: Explainability-Driven Quantization for Low-Bit and Sparse DNNs

no code implementations · 9 Sep 2021 · Daniel Becking, Maximilian Dreyer, Wojciech Samek, Karsten Müller, Sebastian Lapuschkin

The remarkable success of deep neural networks (DNNs) in various applications is accompanied by a significant increase in network parameters and arithmetic operations.

Explainable Artificial Intelligence (XAI) Quantization

FantastIC4: A Hardware-Software Co-Design Approach for Efficiently Running 4bit-Compact Multilayer Perceptrons

no code implementations · 17 Dec 2020 · Simon Wiedemann, Suhas Shivapakash, Pablo Wiedemann, Daniel Becking, Wojciech Samek, Friedel Gerfers, Thomas Wiegand

With the growing demand for deploying deep learning models to the "edge", it is paramount to develop techniques that allow state-of-the-art models to execute within very tight and limited resource constraints.

Quantization

Learning Sparse & Ternary Neural Networks with Entropy-Constrained Trained Ternarization (EC2T)

2 code implementations · 2 Apr 2020 · Arturo Marban, Daniel Becking, Simon Wiedemann, Wojciech Samek

To address this problem, we propose Entropy-Constrained Trained Ternarization (EC2T), a general framework for creating sparse and ternary neural networks that are efficient in terms of storage (e.g., at most two binary masks and two full-precision values are required to store a weight matrix) and computation (e.g., MAC operations reduce to a few accumulations plus two multiplications).
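The storage and compute savings described above can be sketched in a few lines of NumPy. This is an illustrative toy example, not the paper's implementation: the cluster values `w_p` and `w_n` and all variable names are assumptions for demonstration.

```python
import numpy as np

# Hypothetical ternary weight matrix built from two full-precision
# cluster values (illustrative values, not from the EC2T paper).
w_p, w_n = 0.42, -0.37
W = np.array([[w_p, 0.0, w_n],
              [0.0, w_n, w_p]])

# Storage: two binary masks plus the two full-precision values
# are enough to represent the whole weight matrix.
mask_p = (W == w_p).astype(np.float64)  # positions of w_p
mask_n = (W == w_n).astype(np.float64)  # positions of w_n

# Reconstruct W from the compact representation.
W_rec = w_p * mask_p + w_n * mask_n
assert np.allclose(W, W_rec)

# Computation: a matrix-vector product becomes accumulations over
# each mask, followed by only two multiplications per output row.
x = np.array([1.0, 2.0, 3.0])
acc_p = mask_p @ x            # sum of inputs where weight == w_p
acc_n = mask_n @ x            # sum of inputs where weight == w_n
y = w_p * acc_p + w_n * acc_n
assert np.allclose(y, W @ x)  # matches the dense product
```

The zero entries need no storage beyond their absence from both masks, which is where the sparsity benefit comes from.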

Image Classification
