no code implementations • 5 Dec 2022 • Abdullah Basar Akbay, Cihan Tepedelenlioglu
During each round, each agent samples a minibatch of training data and sends its gradient update.
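The round described above can be sketched as a minimal simulation; the task (least-squares regression), the agent count, minibatch size, and learning rate are all illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a linear regression task shared by several agents.
num_agents, minibatch_size, dim = 3, 8, 4
w_true = np.ones(dim)       # unknown target model (for generating data)
w = np.zeros(dim)           # global model held by the aggregator

def local_gradient(w, rng):
    """One agent samples a minibatch and computes its gradient update."""
    X = rng.normal(size=(minibatch_size, dim))
    y = X @ w_true
    residual = X @ w - y
    return X.T @ residual / minibatch_size  # least-squares gradient

# One round: each agent sends its gradient; the aggregator averages and steps.
grads = [local_gradient(w, rng) for _ in range(num_agents)]
w = w - 0.1 * np.mean(grads, axis=0)
```

Averaging the agents' minibatch gradients before the descent step is the standard synchronous aggregation rule; one such round moves the global model toward the target.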
no code implementations • NeurIPS 2021 • Abdullah Basar Akbay, Junshan Zhang
We consider a distributed learning setting where strategic users are incentivized, by a cost-sensitive fusion center, to train a learning model based on local data.