no code implementations • 25 Jan 2024 • Huy Q. Le, Chu Myaet Thwal, Yu Qiao, Ye Lin Tun, Minh N. H. Nguyen, Choong Seon Hong
In this paper, we propose Multimodal Federated Cross Prototype Learning (MFCPL), a novel approach for multimodal federated learning (MFL) under severely missing modalities. MFCPL constructs complete prototypes that provide diverse modality knowledge at the modality-shared level through cross-modal regularization and at the modality-specific level through a cross-modal contrastive mechanism.
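Prototypes in federated and few-shot settings are commonly defined as class-wise mean embeddings; the snippet below is a minimal sketch under that common definition, not the specific MFCPL construction (which additionally distinguishes modality-shared and modality-specific levels).

```python
def class_prototypes(embeddings, labels):
    """Compute per-class prototypes as mean embeddings.

    This follows the common prototype definition (class-wise
    average of feature vectors); MFCPL's complete prototypes
    build on this idea across modality levels.
    """
    sums, counts = {}, {}
    for emb, y in zip(embeddings, labels):
        if y not in sums:
            sums[y] = [0.0] * len(emb)
            counts[y] = 0
        sums[y] = [s + e for s, e in zip(sums[y], emb)]
        counts[y] += 1
    return {y: [s / counts[y] for s in sums[y]] for y in sums}
```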
no code implementations • 22 Jan 2024 • Chu Myaet Thwal, Minh N. H. Nguyen, Ye Lin Tun, Seong Tae Kim, My T. Thai, Choong Seon Hong
Federated learning (FL) has emerged as a promising approach to collaboratively train machine learning models across multiple edge devices while preserving privacy.
no code implementations • 22 Jan 2024 • Chu Myaet Thwal, Kyi Thar, Ye Lin Tun, Choong Seon Hong
Thus, our objective is to provide a personalized clinical decision support system with evolvable characteristics that can deliver accurate solutions and assist healthcare professionals in medical diagnosis.
no code implementations • 22 Jan 2024 • Ye Lin Tun, Chu Myaet Thwal, Le Quang Huy, Minh N. H. Nguyen, Choong Seon Hong
Many recent studies integrate federated learning (FL) with self-supervised learning (SSL) to take advantage of raw training data distributed across edge devices.
no code implementations • 22 Jan 2024 • Chu Myaet Thwal, Ye Lin Tun, Kitae Kim, Seong-Bae Park, Choong Seon Hong
Recent innovations in transformers have shown their superior performance in natural language processing (NLP) and computer vision (CV).
1 code implementation • 28 Nov 2023 • Ye Lin Tun, Chu Myaet Thwal, Ji Su Yoon, Sun Moo Kang, Chaoning Zhang, Choong Seon Hong
We conduct experiments on various FL scenarios, and our findings demonstrate that federated diffusion models have great potential to deliver vision services to privacy-sensitive domains.
no code implementations • 28 Nov 2023 • Ye Lin Tun, Minh N. H. Nguyen, Chu Myaet Thwal, Jinwoo Choi, Choong Seon Hong
Together, self-supervised pre-training and client clustering can be crucial components for tackling the data heterogeneity issues of FL.
no code implementations • 20 Oct 2023 • Loc X. Nguyen, Huy Q. Le, Ye Lin Tun, Pyae Sone Aung, Yan Kyaw Tun, Zhu Han, Choong Seon Hong
Semantic communication has emerged as a pillar for the next generation of communication systems due to its capabilities in alleviating data redundancy.
no code implementations • 21 Mar 2023 • Chaoning Zhang, Chenshuang Zhang, Sheng Zheng, Yu Qiao, Chenghao Li, Mengchun Zhang, Sumit Kumar Dam, Chu Myaet Thwal, Ye Lin Tun, Le Quang Huy, Donguk Kim, Sung-Ho Bae, Lik-Hang Lee, Yang Yang, Heng Tao Shen, In So Kweon, Choong Seon Hong
As ChatGPT goes viral, generative AI (AIGC, a.k.a. AI-generated content) has made headlines everywhere because of its ability to analyze and create text, images, and beyond.
no code implementations • 28 Oct 2022 • Ye Lin Tun, Kyi Thar, Chu Myaet Thwal, Choong Seon Hong
In this paper, we propose a recurrent neural network based energy demand predictor, trained with federated learning on clustered clients to take advantage of distributed data and speed up the convergence process.
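The aggregation step for clustered clients can be sketched as FedAvg applied separately within each cluster, so clients with similar demand profiles share a model. This is a minimal illustration assuming plain weighted averaging of parameter vectors; the function names and cluster assignment are hypothetical, not from the paper.

```python
def fedavg(weights_list, sizes):
    """Weighted average of client parameter vectors (standard FedAvg)."""
    total = sum(sizes)
    dim = len(weights_list[0])
    return [sum(w[i] * n for w, n in zip(weights_list, sizes)) / total
            for i in range(dim)]

def cluster_fedavg(client_weights, client_sizes, cluster_of):
    """Aggregate clients per cluster: each cluster of similar
    energy-demand profiles keeps its own averaged model."""
    clusters = {}
    for cid, w in enumerate(client_weights):
        ws, ns = clusters.setdefault(cluster_of[cid], ([], []))
        ws.append(w)
        ns.append(client_sizes[cid])
    return {c: fedavg(ws, ns) for c, (ws, ns) in clusters.items()}
```

Grouping similar clients before averaging tends to speed up convergence because each cluster model fits a more homogeneous data distribution.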
1 code implementation • 28 Oct 2022 • Ye Lin Tun, Chu Myaet Thwal, Yu Min Park, Seong-Bae Park, Choong Seon Hong
Specifically, FedIntR computes a regularization term that encourages the closeness between the intermediate layer representations of the local and global models.
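A minimal sketch of such a regularization term, assuming it is a mean-squared distance summed over corresponding intermediate layers; the exact FedIntR formulation (e.g. any per-layer weighting) may differ from this simplification.

```python
def intermediate_regularizer(local_reps, global_reps):
    """Sum over layers of the mean squared distance between the
    local model's and global model's intermediate representations.

    Simplified stand-in for the FedIntR term: minimizing it pulls
    the local model's hidden features toward the global model's.
    """
    total = 0.0
    for l_rep, g_rep in zip(local_reps, global_reps):
        total += sum((a - b) ** 2 for a, b in zip(l_rep, g_rep)) / len(l_rep)
    return total
```

During local training this term would be added to the task loss, e.g. `loss = task_loss + lam * intermediate_regularizer(...)`, with `lam` a hypothetical weighting hyperparameter.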