Search Results for author: Kilian Fatras

Found 17 papers, 11 papers with code

Generating and Imputing Tabular Data via Diffusion and Flow-based Gradient-Boosted Trees

2 code implementations • 18 Sep 2023 • Alexia Jolicoeur-Martineau, Kilian Fatras, Tal Kachman

Through empirical evaluation across the benchmark, we demonstrate that our approach outperforms deep-learning generation methods in data generation tasks and remains competitive in data imputation.

Imputation

Simulation-free Schrödinger bridges via score and flow matching

1 code implementation • 7 Jul 2023 • Alexander Tong, Nikolay Malkin, Kilian Fatras, Lazar Atanackovic, Yanlei Zhang, Guillaume Huguet, Guy Wolf, Yoshua Bengio

We present simulation-free score and flow matching ([SF]$^2$M), a simulation-free objective for inferring stochastic dynamics given unpaired samples drawn from arbitrary source and target distributions.

Unbalanced Optimal Transport meets Sliced-Wasserstein

no code implementations • 12 Jun 2023 • Thibault Séjourné, Clément Bonet, Kilian Fatras, Kimia Nadjahi, Nicolas Courty

In parallel, unbalanced OT was designed to allow comparisons of more general positive measures, while being more robust to outliers.
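The sliced approach this paper builds on compares measures through one-dimensional projections, where optimal transport has a closed form via sorting. Below is a minimal NumPy sketch of the standard (balanced) sliced 1-Wasserstein distance between equal-size point clouds; the unbalanced extension studied in the paper additionally reweights mass and is not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sliced_w1(X, Y, n_proj, rng):
    """Monte Carlo sliced 1-Wasserstein between equal-size point clouds.

    Each random unit direction projects both clouds to 1D, where the
    1-Wasserstein distance is the mean absolute gap between sorted samples.
    """
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)          # random unit direction
        total += np.mean(np.abs(np.sort(X @ theta) - np.sort(Y @ theta)))
    return total / n_proj

X = rng.standard_normal((200, 3))
Y = rng.standard_normal((200, 3)) + 2.0  # same shape, shifted distribution
```

The self-distance `sliced_w1(X, X, ...)` is exactly zero, while shifted clouds give a strictly positive value.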

Diffusion models with location-scale noise

no code implementations • 12 Apr 2023 • Alexia Jolicoeur-Martineau, Kilian Fatras, Ke Li, Tal Kachman

Diffusion Models (DMs) are powerful generative models that add Gaussian noise to the data and learn to remove it.
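The "add Gaussian noise and learn to remove it" recipe can be sketched in a few lines of NumPy. This is the generic forward-noising step used by Gaussian diffusion models, not this paper's location-scale variant; the schedule value `alpha_bar_t` is an illustrative placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_gaussian_noise(x0, alpha_bar_t, rng):
    """Forward diffusion: x_t = sqrt(abar_t) * x0 + sqrt(1 - abar_t) * eps."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bar_t) * x0 + np.sqrt(1.0 - alpha_bar_t) * eps
    return xt, eps

x0 = rng.standard_normal((4, 8))            # a toy data batch
xt, eps = add_gaussian_noise(x0, 0.5, rng)  # noised batch at one timestep
# a denoiser eps_hat(xt, t) would be regressed onto eps with an L2 loss
```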

PopulAtion Parameter Averaging (PAPA)

1 code implementation • 6 Apr 2023 • Alexia Jolicoeur-Martineau, Emy Gervais, Kilian Fatras, Yan Zhang, Simon Lacoste-Julien

Based on this idea, we propose PopulAtion Parameter Averaging (PAPA): a method that combines the generality of ensembling with the efficiency of weight averaging.
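The weight-averaging ingredient can be sketched as follows. Note that PAPA periodically nudges each network toward the population average during training rather than hard-replacing the weights; this NumPy fragment shows only the averaging step, under the simplifying assumption that all networks share identical parameter shapes.

```python
import numpy as np

def average_population(weight_sets):
    """Average each named parameter across a population of networks.

    `weight_sets` is a list of dicts mapping parameter name -> array,
    all with identical shapes (a simplifying assumption for this sketch).
    """
    names = weight_sets[0].keys()
    return {n: np.mean([w[n] for w in weight_sets], axis=0) for n in names}

# three toy "networks" with one 2x2 weight matrix each, filled with 0, 1, 2
pop = [{"fc.weight": np.full((2, 2), float(i))} for i in range(3)]
avg = average_population(pop)
# avg["fc.weight"] is the elementwise mean: every entry equals 1.0
```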

Improving and generalizing flow-based generative models with minibatch optimal transport

2 code implementations • 1 Feb 2023 • Alexander Tong, Kilian Fatras, Nikolay Malkin, Guillaume Huguet, Yanlei Zhang, Jarrid Rector-Brooks, Guy Wolf, Yoshua Bengio

CFM features a stable regression objective like that used to train the stochastic flow in diffusion models but enjoys the efficient inference of deterministic flow models.
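For the straight-line (independent-coupling) variant of conditional flow matching, the regression pair is simple to write down: interpolate between a source and a target sample and regress a network onto the constant velocity along that path. A minimal NumPy sketch of the training targets (the network itself is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_targets(x0, x1, t):
    """CFM regression pair for straight-line conditional paths.

    x_t = (1 - t) * x0 + t * x1, with target velocity u = x1 - x0;
    a network v_theta(x_t, t) is trained to minimize ||v_theta - u||^2.
    """
    t = t[:, None]                      # broadcast per-sample t over features
    xt = (1.0 - t) * x0 + t * x1
    u = x1 - x0
    return xt, u

x0 = rng.standard_normal((5, 3))        # samples from the source distribution
x1 = rng.standard_normal((5, 3))        # samples from the target distribution
t = rng.uniform(size=5)                 # one random time per sample pair
xt, u = cfm_targets(x0, x1, t)
```

The minibatch-OT generalization in the paper changes how `(x0, x1)` pairs are coupled, not the regression target itself.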

A Reproducible and Realistic Evaluation of Partial Domain Adaptation Methods

no code implementations • 3 Oct 2022 • Tiago Salvador, Kilian Fatras, Ioannis Mitliagkas, Adam Oberman

In this work, we consider the Partial Domain Adaptation (PDA) variant, where we have extra source classes not present in the target domain.

Model Selection • Partial Domain Adaptation +1

On making optimal transport robust to all outliers

no code implementations • 23 Jun 2022 • Kilian Fatras

To decrease the influence of these outliers in the transport problem, we propose to either remove them from the problem or to increase the cost of moving them by using the classifier prediction.

Optimal transport meets noisy label robust loss and MixUp regularization for domain adaptation

no code implementations • 22 Jun 2022 • Kilian Fatras, Hiroki Naganuma, Ioannis Mitliagkas

It is common in computer vision to be confronted with domain shift: images which have the same class but different acquisition conditions.

Domain Adaptation

Unbalanced minibatch Optimal Transport; applications to Domain Adaptation

2 code implementations • 5 Mar 2021 • Kilian Fatras, Thibault Séjourné, Nicolas Courty, Rémi Flamary

Optimal transport distances have found many applications in machine learning for their capacity to compare non-parametric probability distributions.

Domain Adaptation

Minibatch optimal transport distances; analysis and applications

2 code implementations • 5 Jan 2021 • Kilian Fatras, Younes Zine, Szymon Majewski, Rémi Flamary, Rémi Gribonval, Nicolas Courty

We notably argue that the minibatch strategy comes with appealing properties such as unbiased estimators, gradients and a concentration bound around the expectation, but also with limits: the minibatch OT is not a distance.
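The "not a distance" caveat is easy to demonstrate numerically in 1D, where the exact 1-Wasserstein distance between equal-size samples reduces to sorting. A NumPy sketch (an illustration of the phenomenon, not the paper's estimator definitions):

```python
import numpy as np

rng = np.random.default_rng(0)

def w1_1d(a, b):
    """Exact 1-Wasserstein distance between equal-size 1D samples."""
    return float(np.mean(np.abs(np.sort(a) - np.sort(b))))

def minibatch_w1(a, b, m, k, rng):
    """Average W1 over k random size-m minibatches from each sample."""
    vals = [w1_1d(rng.choice(a, m, replace=False),
                  rng.choice(b, m, replace=False)) for _ in range(k)]
    return float(np.mean(vals))

x = rng.standard_normal(1000)
full_self = w1_1d(x, x)                       # exactly 0
mb_self = minibatch_w1(x, x, m=32, k=50, rng=rng)
# mb_self is strictly positive: different minibatches from the *same*
# sample do not match, so minibatch OT violates the identity axiom.
```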

Generating Natural Adversarial Hyperspectral examples with a modified Wasserstein GAN

no code implementations • 27 Jan 2020 • Jean-Christophe Burnel, Kilian Fatras, Nicolas Courty

In this paper, we present a new method which is able to generate natural adversarial examples from the true data following the second paradigm.

Learning with minibatch Wasserstein : asymptotic and gradient properties

3 code implementations • 9 Oct 2019 • Kilian Fatras, Younes Zine, Rémi Flamary, Rémi Gribonval, Nicolas Courty

Optimal transport distances are powerful tools to compare probability distributions and have found many applications in machine learning.

Wasserstein Adversarial Regularization (WAR) on label noise

1 code implementation • 8 Apr 2019 • Kilian Fatras, Bharath Bhushan Damodaran, Sylvain Lobry, Rémi Flamary, Devis Tuia, Nicolas Courty

Noisy labels often occur in vision datasets, especially when they are obtained from crowdsourcing or Web scraping.

Semantic Segmentation

Variance Reduced Three Operator Splitting

1 code implementation • 19 Jun 2018 • Fabian Pedregosa, Kilian Fatras, Mattia Casotto

This is due to the fact that existing methods require to evaluate the proximity operator for the nonsmooth terms, which can be a costly operation for complex penalties.
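To make the cost concrete: for simple penalties the proximity operator is cheap and closed-form, while for complex penalties (e.g. overlapping group norms) each evaluation can itself require an inner optimization. A NumPy sketch of the cheap case, the L1-norm prox (elementwise soft-thresholding):

```python
import numpy as np

def prox_l1(v, lam):
    """Proximity operator of lam * ||.||_1, i.e.
    argmin_x  0.5 * ||x - v||^2 + lam * ||x||_1,
    solved in closed form by elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

v = np.array([-2.0, -0.3, 0.0, 0.5, 3.0])
out = prox_l1(v, 1.0)
# entries within [-1, 1] are zeroed; the rest shrink toward 0 by 1
```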

Optimization and Control 65K10
