FedDig: Robust Federated Learning Using Data Digest to Represent Absent Clients

3 Oct 2022  ·  Chih-Fan Hsu, Ming-Ching Chang, Wei-Chao Chen ·

Federated Learning (FL) is a collaborative learning paradigm, coordinated by a moderator, that protects data privacy. Existing cross-silo FL solutions seldom address the absence of participating clients during training, which can seriously degrade model performance, particularly for unbalanced and non-IID client data. We address this issue by generating secure data digests from the raw data and using them to guide model training at the FL moderator. The proposed FL with data digest (FedDig) framework tolerates unexpected client absence while preserving data privacy. This is achieved by de-identifying the digests: the encoded features of the raw data are mixed and perturbed in the feature space, with the perturbation following the Laplace mechanism of differential privacy. We evaluate FedDig on the EMNIST, CIFAR-10, and CIFAR-100 datasets. The results consistently outperform three baseline algorithms (FedAvg, FedProx, and FedNova) by large margins in multiple client-absence scenarios.
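The digest generation described above (mixing encoded features, then perturbing them with Laplace noise) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `make_digest`, the choice of a simple mean for mixing, and the `sensitivity`/`epsilon` parameters are assumptions; the paper only specifies that perturbation follows the Laplace mechanism of differential privacy.

```python
import numpy as np

def make_digest(features, epsilon=1.0, sensitivity=1.0, seed=None):
    """Hypothetical sketch of FedDig-style digest generation.

    `features` is an (n, d) array of encoder outputs for a group of raw
    samples. Mixing (here, a simple mean over the group) prevents any
    single sample from being recovered; the added noise is drawn from
    Laplace(0, b) with scale b = sensitivity / epsilon, the standard
    Laplace mechanism of differential privacy.
    """
    rng = np.random.default_rng(seed)
    # Mix: collapse the group of encoded features into one vector.
    mixed = features.mean(axis=0)
    # Perturb: add independent Laplace noise to each feature dimension.
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=mixed.shape)
    return mixed + noise
```

A smaller `epsilon` yields larger noise (stronger privacy, noisier digests); the moderator would then use these digests in place of the absent clients' raw data when aggregating model updates.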
