no code implementations • 17 Jul 2023 • Jun Nie, Danyang Xiao, Lei Yang, Weigang Wu
This alleviates the performance degradation of the aggregated global model.
1 code implementation • 31 Dec 2020 • Binbin Guo, Yuan Mei, Danyang Xiao, Weigang Wu, Ye Yin, Hongli Chang
To achieve model personalization while maintaining generalization, in this paper we propose a new approach, named PFL-MoE, which mixes the outputs of the personalized model and the global model via the MoE architecture.
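The mixing idea can be sketched in a few lines: a gating function produces input-dependent weights that blend the personalized expert's output with the global expert's output. The sketch below is a hypothetical, minimal NumPy illustration of this MoE-style output mixing under assumed shapes; in PFL-MoE the experts and the gate are trained neural models, and the names (`moe_mix`, `gate_w`, etc.) are illustrative only.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def moe_mix(x, personal_w, global_w, gate_w):
    """Blend two experts' outputs with gating weights computed from the input.

    Each expert is modeled here as a simple linear map; the gate
    emits one weight per expert, and the weights sum to 1.
    """
    g = softmax(x @ gate_w)        # g[0] + g[1] == 1
    y_personal = x @ personal_w    # personalized expert output
    y_global = x @ global_w       # global expert output
    return g[0] * y_personal + g[1] * y_global, g

# Toy example with assumed dimensions: 4 input features, 3 outputs, 2 experts.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
personal_w = rng.normal(size=(4, 3))
global_w = rng.normal(size=(4, 3))
gate_w = rng.normal(size=(4, 2))

y, g = moe_mix(x, personal_w, global_w, gate_w)
```

Because the gate weights are a convex combination, the mixed output always lies between the two experts' predictions, which is what lets the model trade off personalization against the generalization of the global model on a per-input basis.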