Search Results for author: Othmane Marfoq

Found 7 papers, 6 papers with code

A Cautionary Tale: On the Role of Reference Data in Empirical Privacy Defenses

no code implementations18 Oct 2023 Caelin G. Kaplan, Chuan Xu, Othmane Marfoq, Giovanni Neglia, Anderson Santana de Oliveira

Within the realm of privacy-preserving machine learning, empirical privacy defenses have been proposed as a solution to achieve satisfactory levels of training data privacy without a significant drop in model utility.

Privacy Preserving

Federated Learning under Heterogeneous and Correlated Client Availability

1 code implementation11 Jan 2023 Angelo Rodio, Francescomaria Faticanti, Othmane Marfoq, Giovanni Neglia, Emilio Leonardi

To this end, CA-Fed dynamically adapts the weight given to each client and may ignore clients with low availability and high correlation.

Federated Learning
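The snippet above describes CA-Fed's core idea: adapt per-client aggregation weights and exclude clients whose availability is too low or too correlated. A minimal sketch of that weighting step follows; the uniform starting weights and the two thresholds (`avail_min`, `corr_max`) are illustrative assumptions, not the paper's actual criterion.

```python
import numpy as np

def adaptive_weights(availability, correlation, avail_min=0.2, corr_max=0.8):
    """Sketch of CA-Fed-style client weighting (thresholds are assumptions).

    Starts from uniform weights, then zero-weights clients whose
    availability is too low or whose availability correlation is too high,
    and renormalizes over the retained clients.
    """
    availability = np.asarray(availability, dtype=float)
    correlation = np.asarray(correlation, dtype=float)
    w = np.ones(len(availability))
    w[(availability < avail_min) | (correlation > corr_max)] = 0.0
    return w / w.sum()

# Client 1 is dropped for low availability, client 2 for high correlation,
# so all aggregation weight concentrates on client 0.
weights = adaptive_weights(availability=[0.9, 0.1, 0.7],
                           correlation=[0.3, 0.2, 0.95])
```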

Federated Learning for Data Streams

1 code implementation4 Jan 2023 Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal

Federated learning (FL) is an effective solution to train machine learning models on the increasing amount of data generated by IoT devices and smartphones while keeping such data localized.

Federated Learning

Personalized Federated Learning through Local Memorization

2 code implementations17 Nov 2021 Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal

Federated learning allows clients to collaboratively learn statistical models while keeping their data local.

Binary Classification · Fairness +3

Federated Multi-Task Learning under a Mixture of Distributions

4 code implementations NeurIPS 2021 Othmane Marfoq, Giovanni Neglia, Aurélien Bellet, Laetitia Kameni, Richard Vidal

The increasing size of data generated by smartphones and IoT devices motivated the development of Federated Learning (FL), a framework for on-device collaborative training of machine learning models.

Fairness · Multi-Task Learning +1

Throughput-Optimal Topology Design for Cross-Silo Federated Learning

1 code implementation NeurIPS 2020 Othmane Marfoq, Chuan Xu, Giovanni Neglia, Richard Vidal

Federated learning usually employs a client-server architecture where an orchestrator iteratively aggregates model updates from remote clients and pushes a refined model back to them.

Federated Learning
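The abstract above describes the standard client-server round: the orchestrator aggregates client updates into a refined global model. A minimal sketch of that aggregation step (FedAvg-style weighted averaging, shown here as a generic illustration rather than this paper's topology-design contribution) could look like:

```python
import numpy as np

def aggregate(client_models, client_sizes):
    """Weighted average of client parameter vectors (FedAvg-style sketch).

    Each client model is a flat parameter vector; weights are the
    clients' local dataset sizes, so larger datasets count more.
    """
    total = sum(client_sizes)
    return sum(n * m for m, n in zip(client_models, client_sizes)) / total

# Two clients with dataset sizes 10 and 30; the orchestrator would push
# this refined model back to the clients for the next round.
models = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
global_model = aggregate(models, client_sizes=[10, 30])  # → array([2.5, 3.5])
```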
