Search Results for author: Eunjeong Jeong

Found 6 papers, 0 papers with code

Personalized Decentralized Federated Learning with Knowledge Distillation

no code implementations · 23 Feb 2023 · Eunjeong Jeong, Marios Kountouris

To cope with this issue, we propose a personalized and fully decentralized FL algorithm that leverages knowledge distillation to empower each device to discern statistical distances between local models.

Federated Learning · Knowledge Distillation
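The abstract mentions using knowledge distillation to discern statistical distances between local models. A common way to compare two models' predictive distributions, as in standard distillation setups, is the KL divergence between their temperature-scaled soft predictions on a shared batch. A minimal sketch of that idea (the function names and toy data are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax, as commonly used in knowledge distillation.
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q), averaged over the batch; eps guards against log(0).
    return float(np.mean(np.sum(p * np.log((p + eps) / (q + eps)), axis=-1)))

# Two devices' logits on a shared toy batch of 4 samples, 3 classes.
rng = np.random.default_rng(0)
logits_a = rng.normal(size=(4, 3))
logits_b = rng.normal(size=(4, 3))

# Statistical distance between the two local models' soft predictions.
dist = kl_divergence(softmax(logits_a, T=2.0), softmax(logits_b, T=2.0))
```

A device could, for instance, weight how much it distills from a peer inversely to this distance, so that statistically dissimilar neighbors contribute less.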

Asynchronous Decentralized Learning over Unreliable Wireless Networks

no code implementations · 2 Feb 2022 · Eunjeong Jeong, Matteo Zecchin, Marios Kountouris

Decentralized learning enables edge users to collaboratively train models by exchanging information via device-to-device communication, yet prior works have been limited to wireless networks with fixed topologies and reliable workers.
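Decentralized learning over unreliable links is often modeled as gossip averaging where some exchanges fail. A minimal sketch of one asynchronous gossip step with random link drops (a toy illustration with scalar parameters; the node-selection and drop model are assumptions, not the paper's scheme):

```python
import random

def gossip_round(params, neighbors, p_drop=0.3, seed=None):
    """One asynchronous gossip step: a randomly woken node averages its
    scalar parameter with one random neighbor, unless the wireless link
    drops, in which case no update happens."""
    rng = random.Random(seed)
    i = rng.randrange(len(params))       # node that wakes up
    j = rng.choice(neighbors[i])         # neighbor it contacts
    if rng.random() < p_drop:
        return params                    # unreliable link: update is lost
    avg = 0.5 * (params[i] + params[j])  # pairwise average
    params = list(params)
    params[i] = params[j] = avg
    return params
```

Pairwise averaging conserves the network-wide sum and shrinks the spread between nodes over time, which is why such schemes converge toward consensus even when individual exchanges fail.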

Mix2FLD: Downlink Federated Learning After Uplink Federated Distillation With Two-Way Mixup

no code implementations · 17 Jun 2020 · Seungeun Oh, Jihong Park, Eunjeong Jeong, Hyesung Kim, Mehdi Bennis, Seong-Lyun Kim

This letter proposes a novel communication-efficient and privacy-preserving distributed machine learning framework, coined Mix2FLD.

Federated Learning · Privacy Preserving
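The "two-way mixup" in the title builds on the standard mixup operation: forming a convex combination of two samples with a Beta-distributed mixing ratio. A minimal sketch of that primitive (the signature and defaults are assumptions; Mix2FLD's specific uplink/downlink mixing procedure is not reproduced here):

```python
import numpy as np

def mixup(x1, x2, alpha=0.2, rng=None):
    # Standard mixup: convex combination of two samples with a
    # Beta(alpha, alpha)-distributed ratio. Mix2FLD applies mixing in
    # both the uplink and downlink directions for privacy preservation.
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1.0 - lam) * x2, lam
```

Because the mixed sample interpolates between two inputs, neither original sample is transmitted in the clear, which is the privacy intuition behind mixing before communication.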

Multi-hop Federated Private Data Augmentation with Sample Compression

no code implementations · 15 Jul 2019 · Eunjeong Jeong, Seungeun Oh, Jihong Park, Hyesung Kim, Mehdi Bennis, Seong-Lyun Kim

On-device machine learning (ML) provides access to a tremendous amount of user data while keeping that data private on local devices rather than storing it in a central entity.

Data Augmentation
