no code implementations • 25 Dec 2023 • Vincent Plassier, Nikita Kotelevskii, Aleksandr Rubashevskii, Fedor Noskov, Maksim Velikanov, Alexander Fishkov, Samuel Horvath, Martin Takac, Eric Moulines, Maxim Panov
Conformal Prediction (CP) stands out as a robust framework for uncertainty quantification, which is crucial for ensuring the reliability of predictions.
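As background for readers new to CP: the abstract names conformal prediction in general, and the simplest instance is split conformal prediction, which wraps any point predictor in finite-sample-valid intervals. The sketch below is a minimal illustration of that generic technique (toy data, an origin-constrained least-squares fit), not the method of this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 2x + noise (illustrative only).
x = rng.uniform(0, 1, size=1000)
y = 2 * x + rng.normal(scale=0.1, size=1000)

# Split into a proper training set and a calibration set.
x_train, y_train = x[:500], y[:500]
x_cal, y_cal = x[500:], y[500:]

# Fit any point predictor; here, least squares through the origin.
slope = np.sum(x_train * y_train) / np.sum(x_train**2)

def predict(x_new):
    return slope * x_new

# Calibration: absolute residuals serve as nonconformity scores.
scores = np.abs(y_cal - predict(x_cal))

# The quantile with the finite-sample correction (n + 1) gives
# marginal coverage of at least 1 - alpha for exchangeable data.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval for a new point.
x_new = 0.5
interval = (predict(x_new) - q, predict(x_new) + q)
```

The key property is distribution-free: the coverage guarantee depends only on exchangeability of the calibration and test points, not on the predictor being well specified.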
no code implementations • 8 Jun 2023 • Vincent Plassier, Mehdi Makni, Aleksandr Rubashevskii, Eric Moulines, Maxim Panov
Federated Learning (FL) is a machine learning framework where many clients collaboratively train models while keeping the training data decentralized.
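To make the FL setting concrete, here is a minimal sketch of federated averaging (FedAvg), the canonical baseline in which each client runs local gradient steps and a server averages the resulting models. The setup (5 clients, local linear regression, the learning rate and step counts) is entirely hypothetical and is not the algorithm studied in this paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: K clients each hold a private linear-regression dataset.
K, n_local, d = 5, 100, 3
true_w = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(K):
    X = rng.normal(size=(n_local, d))
    y = X @ true_w + rng.normal(scale=0.1, size=n_local)
    clients.append((X, y))

def local_train(w, X, y, lr=0.05, steps=20):
    """A few full-batch gradient steps on one client's local data."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# FedAvg: broadcast the global model, train locally, average the results.
# Raw data never leaves the clients; only model parameters are exchanged.
w_global = np.zeros(d)
for _ in range(10):
    local_models = [local_train(w_global.copy(), X, y) for X, y in clients]
    w_global = np.mean(local_models, axis=0)
```

The averaging step is where FL's statistical questions live: with heterogeneous client data, the average of local optima need not equal the optimum of the pooled data.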
no code implementations • 31 Oct 2022 • Vincent Plassier, Alain Durmus, Eric Moulines
This paper focuses on Bayesian inference in a federated learning (FL) context.

no code implementations • 27 Jul 2022 • Hamid Jalalzai, Elie Kadoche, Rémi Leluc, Vincent Plassier
In this paper, we develop a means to measure the leakage of training data, leveraging a quantity that serves as a proxy for the total variation of a trained model near its training samples.
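The general intuition behind such a proxy can be illustrated by probing how sharply a model's output varies under small perturbations around a point: a model that has memorised a training sample tends to vary more sharply near it. The estimator below is only a hypothetical finite-difference illustration of that idea, not the quantity defined in the paper.

```python
import numpy as np

def local_variation(f, x, eps=1e-2, n_probes=100, seed=0):
    """Hypothetical proxy: mean finite-difference variation of f near x,
    estimated from small random Gaussian perturbations of scale eps."""
    rng = np.random.default_rng(seed)
    base = f(x)
    deltas = eps * rng.normal(size=(n_probes, x.shape[0]))
    probes = np.array([f(x + delta) for delta in deltas])
    return np.mean(np.abs(probes - base)) / eps

# A sharply peaked model (mimicking memorisation of x0 = 0) varies far
# more near that point than a smooth model does.
def sharp(x):
    return np.exp(-100 * np.sum(x**2))

def smooth(x):
    return np.exp(-np.sum(x**2))

x0 = np.zeros(3)
v_sharp = local_variation(sharp, x0)
v_smooth = local_variation(smooth, x0)
```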
no code implementations • 11 Jun 2021 • Vincent Plassier, Maxime Vono, Alain Durmus, Eric Moulines
Performing reliable Bayesian inference at a big data scale is becoming a keystone of modern machine learning.
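A standard way to scale posterior sampling to large datasets, useful as background here, is Stochastic Gradient Langevin Dynamics (SGLD): minibatch gradient steps with injected Gaussian noise. The sketch below applies it to a toy Gaussian-mean model; the prior, step size, and batch size are illustrative choices, and this is not the sampler proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: N observations with unknown mean theta and known unit variance.
N = 10_000
data = rng.normal(loc=2.0, scale=1.0, size=N)

def grad_log_post(theta, batch):
    """Minibatch estimate of the gradient of the log posterior, with a
    N(0, 10) prior on theta and the likelihood term rescaled by N/|batch|."""
    prior = -theta / 10.0
    lik = (N / len(batch)) * np.sum(batch - theta)
    return prior + lik

# SGLD update: half a gradient step plus sqrt(step)-scaled Gaussian noise.
step, batch_size = 1e-4, 100
theta, samples = 0.0, []
for t in range(2000):
    batch = rng.choice(data, size=batch_size, replace=False)
    theta += 0.5 * step * grad_log_post(theta, batch) + np.sqrt(step) * rng.normal()
    if t >= 1000:  # discard burn-in
        samples.append(theta)

posterior_mean = np.mean(samples)
```

Only a minibatch is touched per iteration, which is what makes Langevin-type samplers attractive when a full pass over the data per step is infeasible.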
no code implementations • 1 Jun 2021 • Maxime Vono, Vincent Plassier, Alain Durmus, Aymeric Dieuleveut, Eric Moulines
The objective of Federated Learning (FL) is to perform statistical inference for data which are decentralised and stored locally on networked clients.
no code implementations • 16 Jun 2020 • Vincent Plassier, François Portier, Johan Segers
Consider the problem of learning a large number of response functions simultaneously based on the same input variables.
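The computational appeal of sharing the same input variables across responses can be sketched in a few lines: with a common design matrix, one least-squares solve fits all responses simultaneously. The dimensions below are arbitrary, and this is merely an illustration of the shared-input structure, not the estimator studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: n samples, d input variables, m response functions
# all observed on the SAME inputs, so a single design matrix serves them all.
n, d, m = 200, 4, 50
X = rng.normal(size=(n, d))
W_true = rng.normal(size=(d, m))            # one coefficient column per response
Y = X @ W_true + rng.normal(scale=0.1, size=(n, m))

# One least-squares solve fits all m responses at once, amortising the
# factorisation of X across every response function.
W_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
max_err = np.abs(W_hat - W_true).max()
```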