no code implementations • 31 Mar 2024 • Yassir Bendou, Giulia Lioi, Bastien Pasdeloup, Lukas Mauch, Ghouthi Boukli Hacene, Fabien Cardinaux, Vincent Gripon
In this setting, only the label of the target class is available, and the goal is to discriminate between positive and negative query samples without requiring any validation example from the target task.
1 code implementation • 20 Jan 2024 • Reda Bensaid, Vincent Gripon, François Leduc-Primeau, Lukas Mauch, Ghouthi Boukli Hacene, Fabien Cardinaux
In recent years, the rapid evolution of computer vision has seen the emergence of various foundation models, each tailored to specific data types and tasks.
1 code implementation • 24 Nov 2023 • Yassir Bendou, Vincent Gripon, Bastien Pasdeloup, Giulia Lioi, Lukas Mauch, Fabien Cardinaux, Ghouthi Boukli Hacene
In this paper, we present a novel approach that leverages text-derived statistics to predict the mean and covariance of the visual feature distribution for each class.
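To illustrate the idea of classifying visual features with per-class Gaussian statistics, here is a minimal numpy sketch. The class means and covariances are sampled placeholders; in the paper they would instead be predicted from text-derived statistics, and the feature dimension and class setup are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4

# Hypothetical per-class statistics; the paper predicts these from
# text-derived statistics rather than fixing them as done here.
mu = {0: np.zeros(dim), 1: np.ones(dim)}
cov = {c: np.eye(dim) for c in (0, 1)}

def gaussian_score(x, mean, covariance):
    """Log-density of a multivariate Gaussian (up to an additive constant)."""
    diff = x - mean
    inv = np.linalg.inv(covariance)
    _, logdet = np.linalg.slogdet(covariance)
    return -0.5 * (diff @ inv @ diff + logdet)

# Classify each query feature by the class with the highest log-density.
queries = rng.normal(size=(6, dim)) + 1.0  # features near class 1's mean
preds = [max(mu, key=lambda c: gaussian_score(x, mu[c], cov[c])) for x in queries]
print(preds)
```

Once the mean and covariance of each class are available, classification reduces to picking the class whose Gaussian assigns the query the highest likelihood.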
1 code implementation • 30 Sep 2023 • Yihang Chen, Lukas Mauch
To address these issues, we propose Order-Preserving GFlowNets (OP-GFNs), which sample with probabilities in proportion to a learned reward function that is consistent with a provided (partial) order on the candidates, thus eliminating the need for an explicit formulation of the reward function.
no code implementations • 7 Sep 2023 • Pau Mulet Arabi, Alec Flowers, Lukas Mauch, Fabien Cardinaux
Computing gradients of an expectation with respect to the distributional parameters of a discrete distribution is a problem arising in many fields of science and engineering.
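A standard baseline for this problem is the score-function (REINFORCE) estimator, sketched below for a small categorical distribution; this is background illustration, not the estimator proposed in the paper, and the logits and function values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Categorical distribution over 3 outcomes, parameterized by logits.
logits = np.array([0.5, -0.2, 0.1])
f = np.array([1.0, 3.0, 2.0])  # function whose expectation we differentiate

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

p = softmax(logits)

# Exact gradient of E[f] w.r.t. the logits: p_i * (f_i - E[f]).
# Available here because the distribution is tiny; the estimator scales.
exact = p * (f - p @ f)

# Score-function estimator: grad E[f] = E[ f(x) * grad log p(x) ],
# where for a categorical, grad log p(x) w.r.t. logits is onehot(x) - p.
n = 200_000
samples = rng.choice(3, size=n, p=p)
grad_logp = np.eye(3)[samples] - p
estimate = (f[samples][:, None] * grad_logp).mean(axis=0)

print(np.round(exact, 3), np.round(estimate, 3))
```

The Monte Carlo estimate matches the exact gradient in expectation but can have high variance, which is one reason alternative estimators for discrete distributions are an active research topic.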
1 code implementation • 23 Apr 2023 • Bac Nguyen, Lukas Mauch
Deep equilibrium models (DEQs) have proven to be very powerful for learning data representations.
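A DEQ defines its output as the fixed point of a single layer applied repeatedly, rather than as the output of a deep stack of distinct layers. The following toy sketch finds such a fixed point by simple iteration; the contractive affine map is illustrative (real DEQs use learned layers and more sophisticated root-finding solvers).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layer f(z, x) = tanh(W z + x + b). Rescaling W to spectral norm 0.5
# makes f a contraction in z, so a unique fixed point exists.
A = rng.standard_normal((3, 3))
W = 0.5 * A / np.linalg.norm(A, 2)
b = rng.standard_normal(3)

def f(z, x):
    return np.tanh(W @ z + x + b)

def deq_forward(x, tol=1e-8, max_iter=100):
    """Solve z* = f(z*, x) by fixed-point iteration."""
    z = np.zeros_like(x)
    for _ in range(max_iter):
        z_next = f(z, x)
        if np.linalg.norm(z_next - z) < tol:
            break
        z = z_next
    return z

x = rng.standard_normal(3)
z_star = deq_forward(x)
print(np.allclose(z_star, f(z_star, x), atol=1e-6))
```

The returned representation z* satisfies z* = f(z*, x) to numerical tolerance, which is the defining property the abstract refers to.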
1 code implementation • 13 Dec 2022 • Yassir Bendou, Vincent Gripon, Bastien Pasdeloup, Lukas Mauch, Stefan Uhlich, Fabien Cardinaux, Ghouthi Boukli Hacene, Javier Alonso Garcia
Such a set is rarely available in few-shot learning scenarios, a shortcoming that is largely overlooked in the field.
no code implementations • 24 Mar 2021 • Ghouthi Boukli Hacene, Lukas Mauch, Stefan Uhlich, Fabien Cardinaux
We call this procedure DNN Quantization with Attention (DQA).
1 code implementation • 12 Feb 2021 • Takuya Narihira, Javier Alonso Garcia, Fabien Cardinaux, Akio Hayakawa, Masato Ishii, Kazunori Iwaki, Thomas Kemp, Yoshiyuki Kobayashi, Lukas Mauch, Akira Nakamura, Yukio Obuchi, Andrew Shin, Kenji Suzuki, Stephen Tiedmann, Stefan Uhlich, Takuya Yashima, Kazuki Yoshiyama
While there exists a plethora of deep learning tools and frameworks, the fast-growing complexity of the field brings new demands and challenges, such as more flexible network design, fast computation in distributed settings, and compatibility between different tools.
no code implementations • 24 Nov 2020 • Lukas Mauch, Stephen Tiedemann, Javier Alonso Garcia, Bac Nguyen Cong, Kazuki Yoshiyama, Fabien Cardinaux, Thomas Kemp
Usually, we compute the proxy for all DNNs in the network search space and pick those that maximize the proxy as candidates for optimization.
no code implementations • NIPS Workshop CDNNRIA 2018 • Fabien Cardinaux, Stefan Uhlich, Kazuki Yoshiyama, Javier Alonso Garcia, Lukas Mauch, Stephen Tiedemann, Thomas Kemp, Akira Nakamura
For each layer, we learn a value dictionary and an assignment matrix to represent the network weights.
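The dictionary-plus-assignment representation described above can be sketched in a few lines; the dictionary size, values, and layer shape below are illustrative assumptions, and the assignments are sampled rather than learned.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small dictionary of K shared values and a one-hot assignment row per
# weight; in the paper both the dictionary and assignments are learned.
K, n_weights = 4, 10
dictionary = np.array([-0.5, -0.1, 0.1, 0.5])
assign = np.eye(K)[rng.integers(K, size=n_weights)]  # (n_weights, K) one-hot

# Effective weights = assignment matrix @ dictionary: every weight takes
# one of only K distinct values, so just K floats plus indices are stored.
weights = assign @ dictionary
print(sorted(set(np.round(weights, 3))))
```

Because every weight is tied to one of K dictionary entries, storage drops from one float per weight to one small index per weight plus K floats per layer.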
2 code implementations • ICLR 2020 • Stefan Uhlich, Lukas Mauch, Fabien Cardinaux, Kazuki Yoshiyama, Javier Alonso Garcia, Stephen Tiedemann, Thomas Kemp, Akira Nakamura
Since choosing the optimal bitwidths is not straightforward, training methods that can learn them are desirable.
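To see why the bitwidth choice matters, here is a minimal uniform-quantization sketch showing how the quantization error of a weight tensor shrinks as the bitwidth grows; this illustrates the trade-off only, not the learning method proposed in the paper, and the clipping range is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(1000)  # stand-in for a layer's weights

def uniform_quantize(x, bits, max_abs=3.0):
    """Symmetric uniform quantizer with 2**bits levels (illustrative)."""
    step = 2 * max_abs / (2 ** bits - 1)
    return np.clip(np.round(x / step) * step, -max_abs, max_abs)

# Quantization error falls as the bitwidth grows; learning a per-layer
# bitwidth trades model accuracy against memory and compute.
for bits in (2, 4, 8):
    err = np.mean((w - uniform_quantize(w, bits)) ** 2)
    print(bits, round(float(err), 5))
```

Because the error-versus-cost profile differs per layer, a single hand-picked bitwidth is rarely optimal, which motivates learning the bitwidths during training.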
no code implementations • ICLR 2019 • Alexander Bartler, Felix Wiewel, Bin Yang, Lukas Mauch
In this paper, we propose an easy method to train VAEs with binary or categorically valued latent representations.
no code implementations • 23 Oct 2018 • Lukas Mauch, Bin Yang
Deep neural networks (DNNs) are powerful models for many pattern recognition tasks, yet their high computational complexity and memory requirements limit them to applications on high-performance computing platforms.
no code implementations • 25 Jun 2018 • Thomas Küstner, Sergios Gatidis, Annika Liebgott, Martin Schwartz, Lukas Mauch, Petros Martirosian, Holger Schmidt, Nina F. Schwenzer, Konstantin Nikolaou, Fabian Bamberg, Bin Yang, Fritz Schick
Therefore, the automated assessment or assurance of sufficient image quality is of high interest.
no code implementations • 20 Feb 2018 • Karim Said Barsim, Lukas Mauch, Bin Yang
The problem of identifying end-use electrical appliances from their individual consumption profiles, known as the appliance identification problem, is a primary stage in both Non-Intrusive Load Monitoring (NILM) and automated plug-wise metering.