Search Results for author: Lukas Geiger

Found 3 papers, 2 papers with code

Larq Compute Engine: Design, Benchmark, and Deploy State-of-the-Art Binarized Neural Networks

1 code implementation • 18 Nov 2020 • Tom Bannink, Arash Bakhtiari, Adam Hillier, Lukas Geiger, Tim de Bruin, Leon Overweel, Jelmer Neeven, Koen Helwegen

We introduce Larq Compute Engine, the world's fastest Binarized Neural Network (BNN) inference engine, and use this framework to investigate several important questions about the efficiency of BNNs and to design a new state-of-the-art BNN architecture.
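
As a rough illustration of the deployment workflow this paper describes, the sketch below builds a small binarized Keras model with larq and converts it for on-device inference with larq_compute_engine. The layer configuration is an illustrative assumption, not the paper's state-of-the-art architecture.

```python
# Hedged sketch: build a tiny binarized model with larq, then convert it to a
# TensorFlow Lite flatbuffer that Larq Compute Engine's binary kernels can run.
# The architecture below is an illustrative assumption, not the paper's model.
import tensorflow as tf
import larq as lq
import larq_compute_engine as lce

model = tf.keras.Sequential([
    # First layer keeps full-precision inputs; only the kernel is binarized.
    lq.layers.QuantConv2D(32, 3, kernel_quantizer="ste_sign",
                          kernel_constraint="weight_clip",
                          use_bias=False, input_shape=(32, 32, 3)),
    tf.keras.layers.BatchNormalization(),
    # Subsequent layers binarize both activations and kernels.
    lq.layers.QuantConv2D(64, 3, input_quantizer="ste_sign",
                          kernel_quantizer="ste_sign",
                          kernel_constraint="weight_clip", use_bias=False),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

# Convert the Keras model into a .tflite flatbuffer for on-device inference.
flatbuffer = lce.convert_keras_model(model)
with open("bnn.tflite", "wb") as f:
    f.write(flatbuffer)
```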

Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization

3 code implementations • NeurIPS 2019 • Koen Helwegen, James Widdicombe, Lukas Geiger, Zechun Liu, Kwang-Ting Cheng, Roeland Nusselder

Together, the redefinition of latent weights as inertia and the introduction of Bop enable a better understanding of BNN optimization and open up the way for further improvements in training methodologies for BNNs.
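
A minimal NumPy sketch of the Bop update rule the paper introduces, assuming the paper's notation for the adaptivity rate gamma and threshold tau; this is illustrative and not larq's optimizer implementation.

```python
# Hedged sketch of the Bop update: binary weights are flipped directly from an
# exponential moving average of gradients, with no latent real-valued weights.
# Illustrative NumPy version, not larq's larq.optimizers.Bop.
import numpy as np

def bop_step(w, m, grad, gamma=1e-3, tau=1e-6):
    """One Bop update.

    w:    binary weights in {-1, +1}
    m:    exponential moving average of gradients (the "inertia" accumulator)
    grad: gradient of the loss w.r.t. the binary weights
    """
    # Update the gradient moving average.
    m = (1.0 - gamma) * m + gamma * grad
    # Flip a weight when the accumulated gradient is strong enough (|m| > tau)
    # and pushes against the current sign (sign(m) == sign(w)).
    flip = (np.abs(m) > tau) & (np.sign(m) == np.sign(w))
    w = np.where(flip, -w, w)
    return w, m
```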

Generating and refining particle detector simulations using the Wasserstein distance in adversarial networks

no code implementations • 9 Feb 2018 • Martin Erdmann, Lukas Geiger, Jonas Glombitza, David Schmidt

We use adversarial network architectures together with the Wasserstein distance to generate or refine simulated detector data.

Instrumentation and Methods for Astrophysics • High Energy Physics - Experiment
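
A hedged sketch of the Wasserstein-distance losses underlying this kind of adversarial training, assuming image-like detector tensors of shape (batch, x, y, channels); the paper's actual critic and generator architectures and any conditioning are not reproduced here.

```python
# Hedged sketch of Wasserstein (WGAN-style) losses for generating or refining
# simulated detector data. `critic`, `real`, and `fake` are placeholders;
# tensors are assumed to be image-like with shape (batch, x, y, channels).
import tensorflow as tf

def critic_loss(critic, real, fake):
    # Kantorovich-Rubinstein dual: the critic maximizes E[f(real)] - E[f(fake)],
    # so we minimize the negated difference.
    return tf.reduce_mean(critic(fake)) - tf.reduce_mean(critic(real))

def generator_loss(critic, fake):
    # The generator minimizes -E[f(fake)], pushing generated (or refined)
    # samples toward the real-data distribution as scored by the critic.
    return -tf.reduce_mean(critic(fake))

def gradient_penalty(critic, real, fake):
    # One common way to enforce the critic's 1-Lipschitz constraint (WGAN-GP):
    # penalize gradient norms away from 1 on interpolated samples.
    eps = tf.random.uniform([tf.shape(real)[0], 1, 1, 1], 0.0, 1.0)
    interp = eps * real + (1.0 - eps) * fake
    with tf.GradientTape() as tape:
        tape.watch(interp)
        scores = critic(interp)
    grads = tape.gradient(scores, interp)
    norms = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=[1, 2, 3]) + 1e-12)
    return tf.reduce_mean(tf.square(norms - 1.0))
```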
