Federated Learning
1235 papers with code • 12 benchmarks • 11 datasets
Federated Learning is a machine learning approach that allows multiple devices or entities to collaboratively train a shared model without exchanging their data with each other. Instead of sending raw data to a central server for training, each device trains the model locally and sends only its model updates to the server, where they are aggregated to improve the shared model.
This approach allows for privacy-preserving machine learning, as each device keeps its data locally and only shares the information needed to improve the model.
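The aggregation step described above is commonly implemented as FedAvg: the server averages client updates weighted by each client's local dataset size. A minimal sketch (model weights represented as NumPy arrays; function and variable names are illustrative, not from any specific framework):

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of client model weights (FedAvg-style aggregation).

    client_weights: list of per-client models, each a list of np.ndarray layers
    client_sizes:   number of local training samples per client
    """
    total = sum(client_sizes)
    num_layers = len(client_weights[0])
    aggregated = []
    for layer in range(num_layers):
        # Each client's contribution is proportional to its data share.
        layer_avg = sum(w[layer] * (n / total)
                        for w, n in zip(client_weights, client_sizes))
        aggregated.append(layer_avg)
    return aggregated

# Two clients, single-layer "model"; client 2 holds 3x as much data.
w1 = [np.array([1.0, 2.0])]
w2 = [np.array([3.0, 4.0])]
agg = fedavg([w1, w2], client_sizes=[1, 3])
# agg[0] -> 0.25*[1, 2] + 0.75*[3, 4] = [2.5, 3.5]
```

Only the weight arrays cross the network in this scheme; the raw samples never leave the clients.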
Libraries
Use these libraries to find Federated Learning models and implementations.
Latest papers
FLEX: FLEXible Federated Learning Framework
In the realm of Artificial Intelligence (AI), the need for privacy and security in data processing has become paramount.
pfl-research: simulation framework for accelerating research in Private Federated Learning
Federated learning (FL) is an emerging machine learning (ML) training paradigm where clients own their data and collaborate to train a global model, without revealing any data to the server or other participants.
Approximate Gradient Coding for Privacy-Flexible Federated Learning with Non-IID Data
This work focuses on the challenges of non-IID data and stragglers/dropouts in federated learning.
Computation and Communication Efficient Lightweighting Vertical Federated Learning
Moreover, we establish a convergence bound for our LVFL algorithm, which accounts for both communication and computational lightweighting ratios.
Stragglers-Aware Low-Latency Synchronous Federated Learning via Layer-Wise Model Updates
Synchronous federated learning (FL) is a popular paradigm for collaborative edge learning.
Empowering Data Mesh with Federated Learning
To the best of our knowledge, this is the first open-source applied work that represents a critical advancement toward the integration of federated learning methods into the Data Mesh paradigm, underscoring the promising prospects for privacy-preserving and decentralized data analysis strategies within Data Mesh architecture.
An Upload-Efficient Scheme for Transferring Knowledge From a Server-Side Pre-trained Generator to Clients in Heterogeneous Federated Learning
Heterogeneous Federated Learning (HtFL) enables collaborative learning on multiple clients with different model architectures while preserving privacy.
TablePuppet: A Generic Framework for Relational Federated Learning
In this paper, we formalize this problem as relational federated learning (RFL).
Initialisation and Topology Effects in Decentralised Federated Learning
Fully decentralised federated learning enables collaborative training of individual machine learning models across distributed devices in a network while keeping the training data localised.
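In the fully decentralised setting above there is no central server: each node averages its model with its neighbours according to the network topology. A minimal sketch of one gossip-averaging round, assuming scalar per-node parameters and a symmetric adjacency matrix (names are illustrative):

```python
import numpy as np

def gossip_round(params, adjacency):
    """One round of neighbourhood averaging on a peer-to-peer network.

    params:    1-D array where params[i] is node i's (scalar) model parameter
    adjacency: symmetric 0/1 matrix; adjacency[i][j] = 1 if i and j are peers
    """
    params = np.asarray(params, dtype=float)
    A = np.asarray(adjacency, dtype=float)
    np.fill_diagonal(A, 1.0)                  # each node keeps its own value
    W = A / A.sum(axis=1, keepdims=True)      # row-stochastic mixing matrix
    return W @ params                         # every node mixes with neighbours

# Three fully connected nodes: one round reaches the global mean.
topology = [[0, 1, 1],
            [1, 0, 1],
            [1, 1, 0]]
print(gossip_round([0.0, 3.0, 6.0], topology))  # [3. 3. 3.]
```

Repeating such rounds on sparser topologies drives all nodes toward consensus on the network-wide average, which is why initialisation and topology choices matter for convergence.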
Text-Enhanced Data-free Approach for Federated Class-Incremental Learning
In this field, Data-Free Knowledge Transfer (DFKT) plays a crucial role in addressing catastrophic forgetting and data privacy problems.