no code implementations • 15 Mar 2024 • Marc Lafon, Alexandre Thomas
Combining empirical risk minimization with capacity control is a classical machine learning strategy for bounding the generalization gap and avoiding overfitting as the capacity of the model class grows.
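As a minimal sketch of this classical strategy (not the paper's specific contribution): ridge regression minimizes the empirical squared loss plus an L2 penalty, where the penalty strength `lam` controls the effective capacity of the linear model class. All data here is synthetic for illustration.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Minimize (1/n)||Xw - y||^2 + lam * ||w||^2 in closed form."""
    n, d = X.shape
    return np.linalg.solve(X.T @ X / n + lam * np.eye(d), X.T @ y / n)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=50)

w_loose = ridge_fit(X, y, lam=1e-6)   # weak capacity control
w_tight = ridge_fit(X, y, lam=10.0)   # strong capacity control

# Stronger regularization shrinks the solution norm, trading training
# fit for a smaller generalization gap.
print(np.linalg.norm(w_tight) < np.linalg.norm(w_loose))
```

The penalty bounds the norm of admissible solutions, which is one concrete way to keep the model class capacity in check as model size grows.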
no code implementations • 15 Mar 2024 • Marc Lafon, Clément Rambour, Nicolas Thome
In this work, we study the out-of-distribution (OOD) detection problem through the use of the feature space of a pre-trained deep classifier.
Out-of-Distribution Detection
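As a hedged illustration of feature-space OOD detection (a common baseline, not necessarily the paper's exact method): fit a Gaussian to in-distribution features from a pre-trained classifier and flag inputs whose Mahalanobis distance to the fitted distribution is large. Features here are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
id_feats = rng.normal(loc=0.0, scale=1.0, size=(500, 16))  # stand-in ID features
ood_feat = np.full(16, 6.0)                                # far-away "OOD" feature

# Fit a single Gaussian to the ID feature distribution.
mu = id_feats.mean(axis=0)
cov = np.cov(id_feats, rowvar=False) + 1e-6 * np.eye(16)
prec = np.linalg.inv(cov)

def maha(x):
    """Squared Mahalanobis distance to the fitted ID Gaussian."""
    d = x - mu
    return float(d @ prec @ d)

id_scores = np.array([maha(f) for f in id_feats])
print(maha(ood_feat) > id_scores.max())  # OOD feature scores highest
```

Thresholding such a distance-based score on held-out ID data is the usual way to turn it into a detector.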
1 code implementation • 26 May 2023 • Marc Lafon, Elias Ramzi, Clément Rambour, Nicolas Thome
HEAT complements prior estimators of the in-distribution (ID) density, e.g. parametric models such as the Gaussian Mixture Model (GMM), to provide accurate yet robust density estimation.
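A minimal sketch of the parametric baseline the abstract mentions (not HEAT itself): model the ID feature density with a GMM, one Gaussian component per class with responsibilities given by the training labels, and use the mixture log-density as an OOD score. Data and dimensions here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
feats = np.concatenate([rng.normal(-2.0, 1.0, size=(200, 4)),
                        rng.normal(+2.0, 1.0, size=(200, 4))])
labels = np.array([0] * 200 + [1] * 200)

def fit_gmm(feats, labels):
    """One Gaussian component per class: (weight, mean, covariance)."""
    comps = []
    for c in np.unique(labels):
        x = feats[labels == c]
        comps.append((len(x) / len(feats), x.mean(0), np.cov(x, rowvar=False)))
    return comps

def log_density(x, comps):
    """log sum_k pi_k * N(x; mu_k, Sigma_k), computed stably."""
    terms = []
    for pi, mu, cov in comps:
        d = x - mu
        k = len(mu)
        _, logdet = np.linalg.slogdet(cov)
        terms.append(np.log(pi) - 0.5 * (k * np.log(2 * np.pi) + logdet
                                         + d @ np.linalg.solve(cov, d)))
    return np.logaddexp.reduce(terms)

gmm = fit_gmm(feats, labels)
# An ID-like point has much higher log-density than a far-away OOD point.
print(log_density(np.full(4, 2.0), gmm) > log_density(np.full(4, 8.0), gmm))
```

Such parametric estimators are accurate near the training modes but can be brittle in their tails, which is the gap a complementary estimator is meant to fill.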
no code implementations • 29 Sep 2021 • Charles Corbière, Marc Lafon, Nicolas Thome, Matthieu Cord, Patrick Perez
A crucial property of KLoS is that it is a class-wise divergence measure built from in-distribution samples and, unlike current second-order uncertainty measures, requires no OOD training data.
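For context on what "second-order" means here, a hedged illustration (this is a generic evidential-learning measure, not KLoS itself): a Dirichlet over class probabilities with concentration `alpha` carries distributional uncertainty, e.g. the vacuity score K / sum(alpha). Low total evidence yields high vacuity, a common cue for OOD inputs.

```python
import numpy as np

def vacuity(alpha):
    """Vacuity of a Dirichlet with concentration vector alpha: K / sum(alpha).
    Close to 1 when there is almost no evidence, close to 0 with strong evidence."""
    alpha = np.asarray(alpha, dtype=float)
    return len(alpha) / alpha.sum()

confident_id = [50.0, 1.0, 1.0]   # strong evidence for class 0
vacuous_ood  = [1.0, 1.0, 1.0]    # uniform prior, almost no evidence

print(vacuity(vacuous_ood) > vacuity(confident_id))
```

Measures of this family operate on the predicted Dirichlet rather than on a single probability vector, which is what distinguishes second-order from first-order (e.g. entropy-based) uncertainty.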