no code implementations • 23 Feb 2023 • Eunjeong Jeong, Marios Kountouris
To cope with this issue, we propose a personalized and fully decentralized FL algorithm that leverages knowledge distillation to let each device discern statistical distances between local models.
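The abstract does not specify how the distance is computed; one plausible distillation-style measure is the average KL divergence between two models' softened output distributions on a shared reference batch. The function below is a minimal sketch under that assumption (the name `distillation_distance`, the temperature parameter, and the use of raw logits as inputs are all illustrative, not the paper's definition):

```python
import math

def softmax(logits, temperature=1.0):
    """Numerically stable softmax over one logit vector."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_distance(logits_a, logits_b, temperature=1.0):
    """Average KL(p_a || p_b) between the softened output distributions
    of two local models over a shared batch of per-example logits."""
    total = 0.0
    for za, zb in zip(logits_a, logits_b):
        p = softmax(za, temperature)
        q = softmax(zb, temperature)
        total += sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return total / len(logits_a)
```

A device could evaluate this against each peer's model on the same unlabeled batch and weight peers with small distances more heavily; identical models yield a distance of zero.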
no code implementations • 2 Feb 2022 • Eunjeong Jeong, Matteo Zecchin, Marios Kountouris
Decentralized learning enables edge users to collaboratively train models by exchanging information via device-to-device communication, yet prior works have been limited to wireless networks with fixed topologies and reliable workers.
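The exchange-and-average step described here can be sketched as a gossip round in which each device averages its parameters with whichever neighbor links survive; random link drops stand in for unreliable workers and a time-varying topology. This is a simplified consensus step (scalar parameters, independent per-direction drops), not the paper's exact update rule:

```python
import random

def gossip_round(params, neighbors, drop_prob=0.0, rng=random):
    """One decentralized averaging round over an unreliable
    device-to-device topology.

    params:    dict mapping device id -> current parameter (scalar here
               for simplicity; a real model would use a vector).
    neighbors: dict mapping device id -> list of neighbor ids.
    drop_prob: probability that any given link fails this round.
    """
    new = {}
    for i, theta in params.items():
        alive = [j for j in neighbors[i] if rng.random() >= drop_prob]
        group = [theta] + [params[j] for j in alive]
        new[i] = sum(group) / len(group)
    return new
```

With no link failures on a fully connected graph, a single round already reaches consensus at the global mean; with failures, devices simply average over fewer peers and convergence slows rather than breaking.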
no code implementations • 17 Jun 2020 • Seungeun Oh, Jihong Park, Eunjeong Jeong, Hyesung Kim, Mehdi Bennis, Seong-Lyun Kim
This letter proposes a novel communication-efficient and privacy-preserving distributed machine learning framework, coined Mix2FLD.
no code implementations • 16 Aug 2019 • Jihong Park, Shiqiang Wang, Anis Elgabli, Seungeun Oh, Eunjeong Jeong, Han Cha, Hyesung Kim, Seong-Lyun Kim, Mehdi Bennis
Devices at the edge of wireless networks are the last mile data sources for machine learning (ML).
no code implementations • 15 Jul 2019 • Eunjeong Jeong, Seungeun Oh, Jihong Park, Hyesung Kim, Mehdi Bennis, Seong-Lyun Kim
On-device machine learning (ML) provides access to a tremendous amount of user data while keeping each user's data private on the device rather than storing it in a central entity.
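Keeping raw data on the device means only model information crosses the network. A minimal FedAvg-style aggregation sketch illustrates the pattern (this is the generic federated-averaging step, not this paper's specific method, which the excerpt does not detail):

```python
def federated_average(local_models, weights=None):
    """Aggregate locally trained parameter vectors by weighted
    averaging; raw user data never leaves the devices, only the
    parameter vectors do.

    local_models: list of parameter vectors (lists of floats),
                  one per participating device.
    weights:      optional per-device weights (e.g. proportional to
                  local dataset size); defaults to a uniform average.
    """
    n = len(local_models)
    if weights is None:
        weights = [1.0 / n] * n
    dim = len(local_models[0])
    return [sum(w * m[k] for w, m in zip(weights, local_models))
            for k in range(dim)]
```

In a full round, each device would first run a few local gradient steps on its private data before the server (or, in the decentralized setting, the peers) applies this averaging.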
no code implementations • 28 Nov 2018 • Eunjeong Jeong, Seungeun Oh, Hyesung Kim, Jihong Park, Mehdi Bennis, Seong-Lyun Kim
On-device machine learning (ML) enables the training process to exploit a massive amount of user-generated private data samples.